2012 - miCroInfOworlD.com

How To Capture And Save Screenshots Using Snagit 10

Written By KAJANTHAN JS on Saturday 7 January 2012 | 16:37


If you love pictures and want to capture desktop screenshots, it’s time to break the [Print Screen] habit. The [Print Screen] and [Alt]+[Print Screen] methods are fine for grabbing a screenshot, but they capture only one shot at a time and offer no easy way to save a series of captures. In this tutorial, I am going to show you how to capture and save screenshots using Snagit 10. Just follow these straightforward steps.
Step 1: Launch ‘Snagit 10’
Step 2: Click on ‘Full Screen’ option which is located in the right ‘Profiles’ panel
Step 3: Click on ‘Capture’ red button which is located in the right bottom ‘Profile Settings’ panel
Step 4: Click on ‘Snagit’ button, and then click on ‘Save As’ -> ‘Standard Format’ option
Step 5: Select the folder in which you want to save your image file, and then click on the ‘Save’ button
You have to click the ‘Capture’ button each time you want to capture a screenshot of your desktop. I hope this simple tutorial helps you capture screenshots of your desktop.
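Snagit names successive captures for you, but the same sequential-naming idea is easy to reproduce if you ever script captures yourself. Below is a minimal Python sketch (illustrative only, not part of Snagit; the `next_capture_name` helper is my own) that picks the next free `snag-NNN.png` name in a folder:

```python
import os
import re

def next_capture_name(folder, prefix="snag-", ext=".png"):
    """Return the next sequential file name (snag-001.png, snag-002.png, ...)
    that does not yet exist in `folder`."""
    pattern = re.compile(re.escape(prefix) + r"(\d+)" + re.escape(ext) + r"$")
    highest = 0
    for name in os.listdir(folder):
        match = pattern.match(name)
        if match:
            highest = max(highest, int(match.group(1)))
    return f"{prefix}{highest + 1:03d}{ext}"
```

You could pair this with a capture library such as Pillow’s `ImageGrab.grab()` on Windows, saving each shot under the generated name.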

How To Enable Turbo Boost Gaming Mode Using Advanced SystemCare 5


Turbo Boost helps you optimize and speed up your computer. It optimizes your computer more deeply for gaming by terminating unnecessary system and non-Windows services. I am going to tell you how to enable Turbo Boost gaming mode using Advanced SystemCare 5. Just follow these simple steps.
Step 1: Launch ‘Advanced SystemCare’ (Start -> All Programs -> Advanced SystemCare 5 -> Turbo Boost)
Step 2: Select ‘Game Mode’ and click on ‘Next’ button.
Step 3: Select the unnecessary system services from the ‘Item Name’ list, and then click on ‘Next’ button.
Step 4: Select the unnecessary non-Windows services from the ‘Item Name’ list, and then click on ‘Next’ button.
Step 5: Select the unnecessary background applications from the ‘Item Name’ list, and then click on ‘Next’ button.
Step 6: If you want to disable your Windows theme, select the ‘Yes, make it like this’ option; otherwise select ‘No, keep current appearance’. Then click on the ‘Next’ button.
Step 7: If you want these settings to apply at startup, select ‘Would you like to turn on Turbo Boost at Windows start up’ and then click on the ‘Finish’ button; otherwise, just click ‘Finish’.
Now your computer is optimized and faster for gaming. A free trial version of the software is available for download.

How To Uninstall Programs Using CCleaner


CCleaner is one of the best tools for cleaning your Windows PC. It can uninstall programs just like the Add/Remove Programs feature of Windows, so you can use it to remove programs that are no longer useful to you. Just follow these steps.
Step 1: Launch ‘CCleaner’ (Start -> All Programs -> CCleaner -> CCleaner)
Step 2: Click on ‘Tools’, and then click on ‘Uninstall’ tab
Step 3: Select the program you want to remove from the list, and then click on the ‘Run Uninstaller’ button at the right to uninstall it from your computer.
Sometimes you may find applications that have no uninstall option of their own; CCleaner will help you uninstall those programs as well.

How To Use RAM Smartly Using Advanced SystemCare 5


Today I am going to tell you how to increase the performance of your computer and free up unused memory in RAM. IObit Smart RAM monitors memory usage and recycles unused memory blocks to increase available physical memory. Just follow these steps to boost the performance of your computer.
Step 1: Launch ‘Advanced SystemCare 5’ (Start -> All Programs -> Advanced SystemCare 5 -> Advanced SystemCare 5)
Step 2: Click on ‘Smart RAM’ which is located in ‘Toolbox’ tab
Step 3: Click on ‘Go’ and then ‘Smart Clean’ to free unused memory from RAM. If you want to recycle unused memory blocks more thoroughly, click on ‘Go’ and then ‘Deep Clean’ instead.
I hope this simple three-step tutorial helps you increase and enhance the performance of your computer. Don’t forget to share this tutorial within your social circle.
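Smart RAM’s before-and-after numbers come from the same kind of memory query any program can make. As a rough illustration (not IObit’s actual code), here is a small Python helper that computes the percentage of physical RAM still available from Linux `/proc/meminfo`-style counters; on Windows the raw numbers would come from an API such as `GlobalMemoryStatusEx` instead:

```python
def parse_meminfo(text):
    """Parse 'MemTotal: 8169348 kB'-style lines into a {name: kB} dict."""
    info = {}
    for line in text.splitlines():
        if ":" in line:
            key, _, rest = line.partition(":")
            parts = rest.split()
            if parts and parts[0].isdigit():
                info[key.strip()] = int(parts[0])
    return info

def free_memory_percent(info):
    """Percentage of physical RAM still available."""
    return round(100.0 * info["MemAvailable"] / info["MemTotal"], 1)

sample = """MemTotal:        8169348 kB
MemFree:          602660 kB
MemAvailable:    2042337 kB"""
print(free_memory_percent(parse_meminfo(sample)))  # -> 25.0
```

Watching this number before and after a clean is a simple way to see how much memory a tool like Smart RAM actually recovered.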

Help protect your PC with Microsoft Security Essentials

Microsoft Security Essentials is a new, free consumer anti-malware solution for your computer. It helps protect against viruses, spyware, and other malicious software. It's available as a no-cost download for Windows XP SP2 and higher, Windows Vista, and Windows 7.

Why should I download Microsoft Security Essentials?

  • Comprehensive protection—Microsoft Security Essentials helps defend your computer against spyware, viruses, worms, Trojans, and other malicious software.
  • Easy to get, easy to use—Because Microsoft Security Essentials is available at no cost, there's no registration process that requires billing or personal information collection. It installs after a quick download and Genuine Windows validation and then stays automatically up-to-date with the latest protection technology and signature updates.
  • Quiet Protection—Microsoft Security Essentials doesn't get in your way. It runs quietly in the background and schedules a scan when your computer is most likely idle. You only see alerts when you need to take action.

Microsoft Security Essentials security status

Microsoft Security Essentials has a clean, simple home page that shows the security state of your computer.
A green icon means that the security status of your computer is good. Microsoft Security Essentials is up-to-date and is running in the background to help protect your computer against malware and other malicious threats. When your computer has an issue that requires your attention, the look of the Microsoft Security Essentials home page changes based on the issue. The status pane turns either yellow or red depending on the situation, and an action button appears in a prominent location on the page with the suggested action.
A yellow icon means that status is fair or potentially unprotected and that you should take some action, such as turning on real-time protection, running a system scan, or addressing a medium-severity or low-severity threat.
A red icon means that your computer is at risk and that you must address a severe threat to protect it. Click the button to take the recommended action and Microsoft Security Essentials will clean the detected file and then do a quick scan for additional malicious software.
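The three color states described above reduce to a simple severity-to-action lookup. The sketch below is an illustrative Python model of that logic, not Microsoft's implementation:

```python
# Illustrative model of the green/yellow/red status logic described above.
STATUS = {
    "green":  ("protected", "no action needed; real-time protection is running"),
    "yellow": ("fair",      "take action: enable real-time protection, run a scan, "
                            "or address a medium- or low-severity threat"),
    "red":    ("at risk",   "address the severe threat now; a quick scan follows cleanup"),
}

def recommended_action(color):
    """Map an icon color to the status label and suggested action."""
    state, action = STATUS[color.lower()]
    return f"{state}: {action}"

print(recommended_action("red"))
```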
For more product information and to download the product, visit Microsoft Security Essentials.

Technological Healing


More data: Technologies aimed at improving our health are proliferating. One example is this device that plugs into an iPhone to turn it into a glucose monitor. Credit: Patrik Stollarz/AFP/Getty Images
Nanosensors patrolling your bloodstream for the first sign of an imminent stroke or heart attack, releasing anticlotting or anti-inflammatory drugs to stop it in its tracks. Cell phones that display your vital signs and take ultrasound images of your heart or abdomen. Genetic scans of malignant cells that match your cancer to the most effective treatment.
In cardiologist Eric Topol's vision, medicine is on the verge of an overhaul akin to the one that digital technology has brought to everything from how we communicate to how we locate a pizza parlor. Until now, he writes in his upcoming book The Creative Destruction of Medicine: How the Digital Revolution Will Create Better Health Care, the "ossified" and "sclerotic" nature of medicine has left health "largely unaffected, insulated, and almost compartmentalized from [the] digital revolution." But that, he argues, is about to change.
Digital technologies, he foresees, can bring us true prevention (courtesy of those nanosensors that stop an incipient heart attack), individualized care (thanks to DNA analyses that match patients to effective drugs), cost savings (by giving patients only those drugs that help them), and a reduction in medical errors (because of electronic health records, or EHRs). Virtual house calls and remote monitoring could replace most doctor visits and even many hospitalizations. Topol, the director of the Scripps Translational Science Institute, is far from alone: e-health is so widely favored that the 2010 U.S. health-care reform act allocates billions of dollars to electronic health records in the belief that they will improve care.
Anyone who has ever been sick or who is likely to ever get sick—in other words, all of us—would say, Bring it on. There is only one problem: the paucity of evidence that these technologies benefit patients. Topol is not unaware of that. The eminently readable Creative Destruction almost seems to have two authors, one of them a rigorous, hard-nosed physician/researcher who insightfully critiques the tendency to base treatments on what is effective for the average patient. This Topol cites study after study showing that much of what he celebrates may not benefit many individual patients at all. The other author, however, is a kid in the electronics store whose eyes light up at every cool new toy. He seems to dismiss the other Topol as a skunk at a picnic.
Much of the enthusiasm for bringing the information revolution to medicine reflects the assumption that more information means better health care. Actual data offer reasons for caution, if not skepticism. Take telemonitoring, in which today's mobile apps and tomorrow's nanosensors would measure blood pressure, respiration, blood glucose, cholesterol, and other physiological indicators. "Previously, we've been able to assess people's health status when they came in to a doctor's office, but mobile and wireless technology allow us to monitor and track important health indicators throughout the day, and get alerts before something gets too bad," says William Riley, program director at the National Heart, Lung & Blood Institute and chairman of a mobile health interest group at the National Institutes of Health. "Soon there won't be much that we can't monitor remotely."
Certainly, it is worthwhile to monitor blood pressure, glucose, and other indicators; if nothing else, having regular access to such data might help people make better choices about their health. But does turning the flow of data into a deluge lead to better results on a large scale? The evidence is mixed. In a 2010 study of 480 patients, telemonitoring of hypertension led to larger reductions in blood pressure than did standard care. And a 2008 study found that using messaging devices and occasional teleconferencing to monitor patients with chronic conditions such as diabetes and heart disease reduced hospital admissions by 19 percent. But a 2010 study of 1,653 patients hospitalized for heart failure concluded that "telemonitoring did not improve outcomes." Similarly, a recent review of randomized studies of mobile apps for smoking cessation found that they helped in the short term, but that there is insufficient research to determine the long-term benefits. Given the land rush into mobile health technologies, or "m-health," the lack of data on their helpfulness raises concerns. "People are putting out systems and technologies that haven't been studied," says Riley.
These concerns also apply to technologies we don't have yet, like those nanosensors in our blood. For instance, studies have reached conflicting conclusions about whether diabetics benefit from aggressive glucose control—something that could be provided by nanosensors paired with insulin delivery devices. Several studies have found that it can lead to hypoglycemia (dangerously low levels of blood glucose) and does not reduce mortality in severely ill diabetics. And sensors may be no better at detecting incipient cancers or heart attacks. If the ongoing debate about overdiagnosis of breast and prostate cancer has taught us anything, it should be that an abnormality that looks like cancer might not spread or do harm, and therefore should not necessarily be treated. For heart attacks, we need rigorous clinical trials establishing the rate of false positives and false negatives before we start handing out nanosensors like lollipops.
EHRs also seem like a can't-miss advance: corral a patient's history in easily searched electrons, rather than leaving it scattered in piles of paper with illegible scribbles, and you'll reduce medical errors, minimize redundant tests, avoid dangerous drug interactions (the system alerts the prescriber if a new prescription should not be taken with an existing one), and ensure that necessary exams are done (by reminding a physician to, say, test a diabetic's vision).
In practice, however, the track record is mixed. In one widely cited study, scientists led by Jeffrey Linder of Harvard Medical School reported in 2007 that EHRs were not associated with better care in doctor's offices on 14 of 17 quality indicators, including managing common diseases, providing preventive counseling and screening tests, and avoiding potentially inappropriate prescriptions to elderly patients. (Practices that used EHRs did do better at avoiding unnecessary urinalysis tests.) Topol acknowledges that there is no evidence that the use of EHRs reduces diagnostic errors, and he cites several studies that, for instance, found "no consistent association between better quality of care and [EHRs]." Indeed, one disturbing study he describes showed that the rate of patient deaths doubled in the first five months after a hospital computerized its system for ordering drugs.
Financial incentives threaten another piece of Topol's vision. Perhaps the most promising path to personal medicine is pharmacogenomics, or using genetics to identify patients who will—or will not—benefit from a drug. Clearly, the need is huge. Clinical trials have shown that only one or two people out of 100 without prior heart disease benefit from a certain statin, for instance, and one heart attack victim in 100 benefits more from tPA (tissue plasminogen activator, a genetically engineered clot-dissolving drug) than from streptokinase (a cheap, older clot buster). Genetic scans might eventually reveal who those one or two are. Similarly, as Topol notes, only half the patients receiving a $50,000 hepatitis C drug, and half of those taking rheumatoid arthritis drugs that ring up some $14 billion in annual sales, see their health improve on these medications. By preëmptively identifying who's in which half, genomics might keep patients, private insurers, and Medicare from wasting tens of billions of dollars a year.
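The savings argument follows from simple arithmetic: if only half of the patients taking a drug respond, roughly half of the spending buys no benefit. A back-of-the-envelope sketch using the figures quoted above (assuming, for illustration, that spending is spread evenly across responders and non-responders):

```python
def wasted_spending(annual_sales, response_rate):
    """Dollars spent on patients who do not respond, assuming spending
    is spread evenly across all patients taking the drug."""
    return annual_sales * (1.0 - response_rate)

# Rheumatoid arthritis drugs: ~$14 billion/year in sales, ~50% of patients improve.
ra_waste = wasted_spending(14e9, 0.5)
print(f"${ra_waste / 1e9:.0f} billion/year potentially wasted")  # $7 billion/year
```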
Yet despite some progress in matching cancer drugs to tumors, pharmacogenomics "has had limited impact on clinical practice," says Joshua Cohen of the Tufts Center for the Study of Drug Development, who led a 2011 study of the field. Several dozen diagnostics are in use to assess whether patients would benefit from a specific drug, he estimates; one of the best-known analyzes breast cancers to see if they are fueled by a mutation in the her2 protein, which means they are treatable with Herceptin. But insurers still doubt the value of most such tests. It's not clear that testing everyone who's about to be prescribed a drug would save money compared with giving it to all those patients and letting the chips fall where they may.
Genotyping is not even routine in clinical trials of experimental cancer drugs. As Tyler Jacks, an MIT cancer researcher, recently told me, companies "run big dumb trials" rather than test drugs specifically on patients whose cancer is driven by the mutation the drug targets. Why? Companies calculate that it is more profitable to test these drugs on many patients, not just those with the mutation in question. That's because although a new drug might help nearly all lung cancer patients with a particular mutation, a research trial might indicate that it helps—just to make up a number—30 percent of lung cancer patients as a whole. Even that less impressive number could be enough for Food and Drug Administration approval to sell the drug to everyone with lung cancer. Limiting the trial to those with the mutation would limit sales to those patients. The risk that the clinical trial will fail is more than balanced by the chance to sell the drug to millions more people.
Such financial considerations are not all that stands in the way of Topol's predictions. He and other enthusiasts need to overcome the lack of evidence that cool gadgets will improve health and save money. But though he acknowledges the "legitimate worry" about adopting technologies before they have been validated, his cheerleading hardly flags. "The ability to digitize any individual's biology, physiology, and anatomy" will "undoubtedly reshape" medicine, he declares, thanks to the "super-convergence of DNA sequencing, mobile smart phones and digital devices, wearable nanosensors, the Internet, [and] cloud computing." Only a fool wouldn't root for such changes, and indeed, that's why Topol wrote the book, he says: to inspire people to demand that medicine enter the 21st century. Yet he may have underestimated how much "destruction" will be required for that goal to be realized.

Worm steals 45,000 Facebook passwords, researchers say


A computer worm has stolen 45,000 login credentials from Facebook, security experts have warned.
The data is believed to have been taken largely from Facebook accounts in the UK and France, according to security firm Seculert.
The culprit is a well-known piece of malware - dubbed Ramnit - which has been around since April 2010 and has previously stolen banking details.
Facebook told the BBC that it was looking into the issue.
The latest iteration of the worm was discovered in Seculert's labs.
"We suspect that the attackers behind Ramnit are using the stolen credentials to log in to victims' Facebook accounts and to transmit malicious links to their friends, thereby magnifying the malware's spread even further," said the researchers on the firm's blog.
"In addition, cybercriminals are taking advantage of the fact that users tend to use the same password in various web-based services to gain remote access to corporate networks," it added.
'Viral power'
Social networks offer rich pickings for hackers because of the huge amount of personal data that is stored on them. Increasingly malware is being updated for the social networking age.
"It appears that sophisticated hackers are now experimenting with replacing the old-school email worms with more up-to-date social network worms. As demonstrated by the 45,000 compromised Facebook subscribers, the viral power of social networks can be manipulated to cause considerable damage to individuals and institutions when it is in the wrong hands," said Seculert.
According to Seculert, 800,000 machines were infected with Ramnit from September to the end of December 2011.
Microsoft's Malware Protection Center (MMPC) described Ramnit as "a multi-component malware family which infects Windows executable as well as HTML files... stealing sensitive information such as stored FTP credentials and browser cookies".
In July 2011 a Symantec report estimated that Ramnit worm variants accounted for 17.3% of all new malicious software infections.
For Facebook users concerned that they have been affected by the worm, the advice is to run anti-virus software.
"It won't necessarily be obvious that you have been attacked. The worm is stealing passwords so it is not going to announce itself," said Graham Cluley, senior security consultant at Sophos.
Update - Friday 6 January, 10:22am: Facebook has responded to this article with the following statement:
"Last week we received from external security researchers a set of user credentials that had been harvested by a piece of malware. Our security experts have reviewed the data, and while the majority of the information was out-of-date, we have initiated remedial steps for all affected users to ensure the security of their accounts.
"Thus far, we have not seen the virus propagating on Facebook itself, but have begun working with our external partners to add protections to our anti-virus systems to help users secure their devices. People can protect themselves by never clicking on strange links and reporting any suspicious activity they encounter on Facebook.
"We encourage our users to become fans of the Facebook Security Page for additional security information."

4 hot Microsoft technologies coming in 2012


The year 2012 brings with it prophecies (of doom, unfortunately -- thanks, Mayans), predictions, and promises in every industry. While I'm no seer of the future, there are four tools and toys I'm particularly anticipating from Microsoft this new year, after 2011's relative dearth of tools and toys from the company.
Microsoft technology No. 1: PST Capture tool
With a big push to get the PST Exchange mailbox archives off the desktop, Microsoft has added archiving to Exchange 2010. The reasons for getting rid of PSTs relate to legal compliance and discovery, which is much more complex -- if not downright impossible -- with PSTs out in the wild. The new archive feature in Exchange, although very much appreciated, still doesn't help us easily get the PST captured and imported into the Exchange Store.
Since July 2011, we have been waiting for Microsoft to release its PST Capture; it was supposed to have shipped by now, but it looks as if it will ship by April. It's true that third-party tools already offer similar archiving capabilities, but I prefer to get such tools for free from the Microsoft Exchange team, especially when budgets are tight.
Microsoft technology No. 2: Windows 8 tablets
After recently buying my wife an iPad 2 (and secretly admiring it), I keep saying, "Well, sure this is nice, but wait until Windows 8." The response I keep getting, even from some big names in IT: "Pete, it's over. Microsoft lost this one." I refuse to accept that.
Apple has to worry about both the tablet and the OS. Microsoft only has to worry about the OS. The viability of that approach is already proven in the Android market, where tablets enhanced by Samsung and others are challenging the iPad phenomenon. Microsoft has already done most of the work in Windows 8, and its success or failure will come down to three issues:
  • It must appeal to the masses, which means it has to be sleek and iCandy-oriented like the iPad
  • The marketing campaign has to be compelling (in other words, no Jerry Seinfeld and Bill Gates); Microsoft might want to see if Justin Long's contract has run out over at Apple
  • Developers will need to embrace it and create apps for it in much the same way the iPad and Android tablets gained thousands of apps

Microsoft technology No. 3: Windows Server 8
I've written several articles outlining all the great features (there are hundreds of them) coming with Windows Server 8, including Hyper-V enhancements, Active Directory changes, and PowerShell updates. It's probably the most exciting release of Windows Server since the 2000 edition, where we first saw Active Directory. Don't get me wrong, Windows Server 2003 and Windows Server 2008 are both excellent, but there's something special about the enterprise-oriented features promised for Windows Server 8 that has me giggly.

Microsoft technology No. 4: Windows Azure updates
Microsoft has been enhancing its Azure cloud-based offering with lower prices and new features like an SDK. The big thing to look for is the connection via Hortonworks with Apache Hadoop. I'll be honest -- none of this was interesting to me until I read Mary Jo Foley's review, then checked out Microsoft's 10-minute Channel 9 video. The latter explains a bit more on how the integration will provide impressive access to both on-premises Windows Servers and Azure through tools with which admins are already familiar.

Wishful thinking: Office 2012 servers
I'll be honest -- though I hear buzz on the wire about Exchange 2012, SharePoint 2012, and Lync 2012, I don't think we'll see any of these this year. The 2010 flavor of each is still being deployed, and releasing new versions so soon could saturate the market for no reason. Based on Microsoft's history of server releases, three- or four-year gaps are the norm. But one can dream, and if 2012 versions of the Office servers appear, that'll be exciting.

More to anticipate from Redmond
I'm also looking forward to Windows 8, although I don't know how relevant it will be or how much of a dent it will make in enterprise deployments of Windows 7, which are doing well. (We'll finally be rid of XP soon, I hope.) The 2012 revamp of System Center should also be interesting. Rest assured that 2012 will be anything but a dry year for new Microsoft technology.

This article, "4 hot Microsoft technologies coming in 2012," was originally published at InfoWorld.com. Read more of J. Peter Bruzzese's Enterprise Windows blog and follow the latest developments in Windows at InfoWorld.com. For the latest business technology news, follow InfoWorld.com on Twitter.

Nginx overtakes Microsoft as No. 2 Web server


With financial backing from the likes of Michael Dell and other venture capitalists, open source upstart Nginx has edged out Microsoft IIS (Internet Information Server) to hold the title of second-most widely used Web server among all active websites. What's more, according to Netcraft's January 2012 Web Server Survey, Nginx over the past month has gained market share among all websites, whereas competitors Apache, Microsoft, and Google each lost share.
Nginx has made quite a splash since its creator Igor Sysoev, along with Andrew Alexeev, co-founded the company last July. The platform, which debuted in 2004 and was designed for high-volume Web traffic, runs on some 25 percent of the world's 1,000 busiest websites, including Facebook, Zappos, Groupon, Hulu, Dropbox, and WordPress. Nginx last October received $3 million in funding from venture capital firms MSD Capital -- the Dell CEO's private investment company -- as well as BV Capital and Runa Capital.
According to Netcraft, Nginx now runs on 12.18 percent of all active websites -- that is, sites with unique content that aren't generated from templates -- for a total of around 22.2 million. In all, Netcraft found over 582 million websites for the January 2012 survey; around 182 million were active.
By comparison, Nginx ran on 11.6 percent of active sites (around 20.3 million in total) at the start of last December, meaning the open source platform's share jumped 0.57 percent. Microsoft IIS, by contrast, now runs on 12.14 percent of active websites, for a total of around 22.1 million. That represents a 0.17 percent drop compared to its December standing, when the platform powered 12.31 percent of active websites for a total of about 21.6 million.
Notably, although top-ranked Apache holds a healthy lead over the competition, running 57.93 percent of all active websites for a total of around 105.7 million, it too experienced a decline in overall share, down from 58.2 percent in December. Google, ranked No. 4 on Netcraft's list, saw its share dip as well, from 8.13 percent to 7.9 percent.
Taking into account servers for all sites covering all domains, Microsoft holds a healthy lead over Nginx, accounting for 14.46 percent of sites for a total of around 84.2 million. Nginx runs on 56.1 million of all sites, representing 9.63 percent. Apache dwarfs them both: 64.9 percent for a total of 378 million sites. Google rests in the fourth spot with a 7.9 percent market share, covering 14.4 million sites.
However, whereas Apache, Microsoft, and Google saw their market share drop, respectively, by 0.3 percent, 0.39 percent, and 0.07 percent since last month, Nginx's share jumped 0.78 percent.
Finally, among the world's 1 million busiest sites, Apache holds a market share of 64.4 percent (640,547 sites), down 0.36 percent since December; Microsoft's share is 14.99 percent (149,209 sites), down 0.01 percent; Nginx represents 8.49 percent (84,541 sites), up 0.28 percent month over month; and Google handles 2.4 percent (23,894 sites), an increase of 0.09 percent.
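Netcraft's percentages and site counts can be cross-checked against one another. A quick sanity check in Python, using the survey's approximate totals for January 2012 (about 582 million sites overall, roughly 182 million of them active):

```python
def share_percent(sites, total):
    """Market share as a percentage, rounded to two decimals."""
    return round(100.0 * sites / total, 2)

ACTIVE_TOTAL = 182e6  # approximate active sites, Netcraft Jan 2012 survey

# Nginx: ~22.2M active sites -> about 12.2 percent, matching the reported 12.18
print(share_percent(22.2e6, ACTIVE_TOTAL))
# Apache: ~105.7M active sites -> about 58.1 percent, close to the reported 57.93
print(share_percent(105.7e6, ACTIVE_TOTAL))
```

The small residuals simply reflect the rounding in the quoted figures.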
This article, "Nginx overtakes Microsoft IIS as No. 2 Web server," was originally published at InfoWorld.com. Get the first word on what the important tech news really means with the InfoWorld Tech Watch blog. For the latest business technology news, follow InfoWorld.com on Twitter.

Copyright © 2011. miCroInfOworlD.com - All Rights Reserved