Whatever you may think of its business practices, Microsoft has always been top-notch when it comes to developer tools. Visual Studio is widely hailed as the best IDE out there, and .Net is an intelligently designed platform that borrows the best of what Java has to offer and takes it a few steps further.
You might have concluded from the Metro-centric keynotes that .Net developers were being sidelined. Nothing could be further from the truth. Looking past the Metro hype, the Build conference also revealed promising road maps for C#, Visual Studio, and the .Net platform as a whole.
Perhaps the most exciting demo of the conference for .Net developers, however, was Project Roslyn, a new technology that Microsoft made available yesterday as a Community Technology Preview (CTP). Roslyn aims to bring powerful new features to C#, Visual Basic, and Visual Studio, but it’s really much more than that. If it succeeds, it will reinvent how we view compilers and compiled languages altogether.
Deconstructing the compiler
Roslyn has been described as “compiler-as-a-service technology,” a term that’s caused a lot of confusion. I’ve even seen headlines heralding the project as “Microsoft’s cloud compiler service” or “bringing .Net to the cloud.” None of that is correct. Technically, it would be possible to offer code compilation as a cloud-based service, but it’s hard to see the advantage, except in special circumstances.
Roslyn isn’t services in the sense of software-as-a-service (SaaS), platform-as-a-service (PaaS), or similar cloud offerings. Rather, it’s services in the sense of Windows services. Roslyn is a complete reengineering of Microsoft’s .Net compiler toolchain, such that each phase of the code compilation process is exposed as a service that can be consumed by other applications.
As Microsoft’s Anders Hejlsberg explained in a Build conference session, “Traditionally, a compiler is just sort of a black box. On one side you feed it source files, magic happens, and out the other end comes object files, or assemblies, or whatever the output format is.”
Internally, however, there’s a lot more going on. Typically, first the compiler parses your source code and breaks it down into a syntax tree. Then it builds a list of all the symbols in your program. Then it begins binding the symbols with the appropriate objects and so on.
An ordinary compiler discards all of this intermediate information once the final code is output. But with Roslyn-enabled compilers, the data from each step is accessible via its own .Net APIs. For example, a call to one API will return the entire syntax tree of a given piece of code as an object. A call to another API might return the number of methods in the code.
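The idea is easy to demonstrate with a language that already exposes its parser as a library. The sketch below uses Python's standard `ast` module as a stand-in for Roslyn's .Net syntax APIs: one call returns the full syntax tree as an object, and a short walk over that tree answers a question like "how many methods does this code define?"

```python
import ast

source = """
class Greeter:
    def hello(self):
        return "hi"
    def goodbye(self):
        return "bye"
"""

# One call returns the entire syntax tree of the code as an object...
tree = ast.parse(source)

# ...and walking that tree answers questions like "how many methods?"
methods = [node.name for node in ast.walk(tree)
           if isinstance(node, ast.FunctionDef)]
print(methods)  # ['hello', 'goodbye']
```

Roslyn's actual APIs are .Net libraries with much richer symbol and binding information, but the point is the same: the compiler's intermediate data becomes a first-class object instead of being thrown away.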
So what is Roslyn good for?
The most obvious advantage of this kind of “deconstructed” compiler is that it allows the entire compile-execute process to be invoked from within .Net applications. Hejlsberg demonstrated a C# program that passed a few code snippets to the C# compiler as strings; the compiler returned the resulting IL assembly code as an object, which was then passed to the Common Language Runtime (CLR) for execution. Voilà! With Roslyn, C# gains a dynamic language’s ability to generate and invoke code at runtime.
Put that same code into a loop that accepts input from the user, and you’ve created a fully interactive read-eval-print loop (REPL) console for C#, allowing you to manipulate and experiment with .Net APIs and objects in real time. With the Roslyn technology, C# may still be a compiled language, but it effectively gains all the flexibility and expressiveness that dynamic languages such as Python and Ruby have to offer.
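Python's standard library even ships the pieces for such a console. This sketch uses `code.InteractiveConsole`, which compiles and executes each pushed line immediately, the same read-eval-print pattern the Roslyn demo builds for C#:

```python
import code

# InteractiveConsole compiles and runs each line as it arrives.
console = code.InteractiveConsole()
console.push("x = 6 * 7")  # a statement: executed immediately
console.push("x")          # an expression: its value is printed, REPL-style

# The console's namespace persists between inputs, so state accumulates
# exactly as it does in a live session.
assert console.locals["x"] == 42
```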
The separate phases of the compilation process have their uses, too. For example, according to a blog post by Microsoft’s Eric Lippert (Silverlight required), various groups have written their own C# language parsers, even within Microsoft. Maybe the Visual Studio team needed to write a syntax-coloring component, or maybe another group wanted to translate C# code into something else. In the past, each team would write its own parser, of varying quality. With Roslyn, they can simply access the compiler’s own syntax parser via an API and get back a syntax tree that’s exactly the same as what the compiler would use. (Roslyn even exposes a syntax-coloring API.)
The syntax and binding data exposed by the Roslyn APIs also makes code refactoring easier. It even allows developers to write their own code refactoring algorithms in addition to the ones that ship with Visual Studio.
Hejlsberg’s most remarkable demo, however, showed how Roslyn’s syntax tree APIs make it remarkably easy to translate source code from one CLR language to another. To illustrate, Hejlsberg copied some Visual Basic source code to the clipboard, opened a new file, and chose Paste as C#. The result was the same algorithm, only now written in C#. Translations back and forth don’t yield identical code — for loops might translate into, say, while loops — but in all cases the code was perfectly valid, ready to compile, execute, or refactor.
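The mechanism is source → tree → source: once code is a syntax tree, new source can be emitted from the tree, in the same language or a different one. Roslyn maps a Visual Basic tree onto C# syntax; the single-language Python sketch below simply round-trips code through its tree, showing that the regenerated source is valid and equivalent without being character-for-character identical:

```python
import ast

src = "for i in range(3): print(i)"

tree = ast.parse(src)            # source -> syntax tree
regenerated = ast.unparse(tree)  # syntax tree -> source

# The regenerated code is reformatted (the body moves onto its own line),
# but it parses to an identical tree, so it compiles and runs the same.
print(regenerated)
assert ast.dump(ast.parse(regenerated)) == ast.dump(tree)
```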
Can I have it now, please?
The catch: Hejlsberg wouldn’t commit to a ship date for the Roslyn technologies or even that they’d make it into a shipping Visual Studio release. For that matter, he wouldn’t comment on any future Visual Studio releases or whether there would be another version at all. Even the Roslyn CTP release is running a little late. At the Build conference running Sept. 13 to 16, Hejlsberg said it would arrive “in four weeks.” It arrived yesterday — a week late.
Don’t think Roslyn is too far-fetched to happen, though. It’s actually very similar to the Mono project’s Mono.CSharp library, which exposes the Mono C# compiler as a service and enables a REPL console much like the one Hejlsberg demoed at Build. Mono.CSharp has been shipping with Mono since version 2.2.
The main drawback of Roslyn is that it’s a complete retooling of the .Net compilers, rather than of the platform itself. That means it’s limited to C# and Visual Basic, at least for its initial release. If developers using other .Net languages want to take advantage of Roslyn-like capabilities, those languages’ compilers will need to be completely rewritten.
But maybe they should be. If Microsoft succeeds with everything it has planned, Roslyn represents not merely a new iteration of the Visual Studio toolchain but a whole new way for developers to interact with their tools. It breaks down the barriers between compiled and dynamic languages and enables powerful new interactive capabilities in the coding process itself. It truly is one of the most ambitious and exciting innovations in compiler technology in a long time.
According to Microsoft, some 2 million iOS 5 devices are connected to Hotmail, and that number is growing by some 100,000 every day.
With the release of iOS 5 it became easier for users to configure their iDevices to send and receive email via Hotmail, because Apple added it as a default option in the Add Account … screen. And it seems that users have embraced this option enthusiastically.
40% of those connecting to Hotmail use an iPhone 4, while a remarkable 24% are using iPhone 4S handsets. This is particularly amazing considering that the 4S has only been available for three weeks.
In just 18 months, Android has come from nowhere to become the mobile OS powering just under half of all smartphones sold in the UK – and half the people owning a mobile phone in the UK have a smartphone.
In the process it has bested Nokia’s Symbian (since declared dead, though still stumbling to its grave), RIM’s BlackBerry OS (which is fighting back) and Apple’s iPhone (which, given its comparatively high price until the latest cuts to the iPhone 3GS and iPhone 4, was never likely to dominate long-term).
It’s an amazing run for Android, and one likely to carry on into 2012: it has taken four years for smartphones to reach this point (longer if you count Nokia’s, RIM’s and Microsoft’s offerings from 2005/6 as smartphones), but the number of smartphones being sold is accelerating.
What these figures don’t show you is that the entire market is growing; Kantar ComTech WorldPanel, which provides the statistics, declined to give absolute sales figures (they want to have something to tempt clients to buy the full reports). A minor note: these figures go up to 2 October, just before the iPhone 4S launch; expect that Apple’s share will recover slightly. Even so, Android is just going to keep growing.
It’s very likely that in the next two years you’ll see smartphones reach something like 90% penetration in the UK – if only because fewer shops and carriers will be selling feature phones, for two reasons: (a) they make less money selling them in the first place (b) carriers get less money from phones that don’t have data plans.
Android is almost certain to sweep the board here: it could hit up to 70% market share in one or two years (remember, market share is “share of handsets being sold”, not “share of handsets in people’s hands”). That’s because Android handsets from cheaper manufacturers such as China’s Huawei and ZTE will come in at the bottom of the market (it’s noticeable how the “Other” segment has fallen to zero in the past year). Pretty soon you’re going to be able to get a smartphone for almost nothing in your local supermarket. And you can already get really cheap PAYG data options from Three or GiffGaff.
For the record, I think it’s great that smartphones are becoming pervasive. Putting the internet in everyone’s hands, wherever they are? (If only the mobile carriers would stop holding up the 4G auction.) That’s got to be a really good thing.
Does it matter, though, whether the pervasive OS is Android, or what share this or that OS has, beyond the willy-waving horse race that some people love to indulge in? Here’s Henry Blodget over at BusinessInsider, who slams on the CAPS LOCK to pronounce “ATTENTION APPLE FANS: Samsung Blowing Past Apple To Become The Biggest Smartphone Vendor Is Not Good News”. (By which he means not good news for Apple. Though by implication, it would also be Not Good News for RIM and Nokia either.) Blodget’s take: As the history of the tech industry has demonstrated again and again, technology platform markets tend to standardize around a single dominant platform. Although several different platforms can co-exist while a market is developing, eventually a clear leader emerges. And as it does, the leader’s power and “network effects” grow, while the leverage of the smaller platforms diminishes.
In the case of Android, this growing power will not lead to enormous profits for Google, because, right now anyway, Google is not selling Android. (Instead, Google is building a “moat” around its wildly profitable search business and making it easier for people to use Google search from their phones. This may change when Google acquires Motorola and starts selling integrated handsets itself.) But the better Android phones get, and the more market share Android gains, the more Android’s network effects will increase, and the more Apple’s leverage over the iPhone ecosystem will diminish. And that can only be bad news for Apple’s ability to continue to command exploding profits from iPhones, app developers, musicians, media companies, and others who now must pay it big distribution fees because they have no other choice. Similarly, the bigger other global handset manufacturers get relative to Apple, the less (relative) leverage Apple will have over partners in the global parts-and-manufacturing supply chains.
There are three things there. Let’s take the last one first: supply chains. Apple didn’t do badly in 2007 when it was an entrant to the mobile phone supply chain, and it’s got enough money in the bank that it can guarantee supplies any time it likes. (That’s what it uses its cash reserves outside the US to do: buy up future outputs from various factories.) Most smartphone manufacturers don’t have much scale; that’s unlikely to change. Samsung is likely to get bigger (though it would be helpful if it would be more forthcoming about how many phones and how many tablets it has pushed out the door). That won’t stop Apple making phones, though. And by proxy, it won’t stop RIM or Nokia making phones – Nokia is still the world’s largest in handset volume. Only mismanagement can mess that up.
Now to the first point, about “the history of the tech industry”. Actually, the history of the tech industry is a wide and varied thing, which doesn’t show any clear lessons about dominant platforms. Yes, you do get dominant platforms, but that doesn’t prevent other players existing within niches and making good money from it. Cite 1: Apple, making nice money, thanks, from the PC market. Cite 2: Microsoft, making nice money, thanks, from the server OS market, despite Linux being the most-used. The leverage of Apple and Microsoft in those respective spaces is helped by the existence of standards, and it’s those – plus the internet – which make the “platform” idea less powerful on smartphones.
Think of it like this: if the PC market had started when the internet was already pervasive, then operating systems would have had to have internet standards built in; that would have forced more interoperability. It was the threat that Netscape might force interoperability on all computing platforms that scared the bejeezus out of Microsoft in the 1990s. So smartphones, which are arriving when the internet is pervasive, will live by different standards.
Horace Dediu, who runs the consultancy Asymco, puts it like this: imagine a world where 5 billion people have a smartphone. In that case, a 10% share translates to 500 million users. Even a 1% share is 50 million. If you couldn’t make a profit from 50 million users, you probably shouldn’t be in the business at all.
And just a side note on that “wildly profitable” Google search from their phones. All the web stats, and Google’s own stats, indicate that – for now anyway – about two-thirds or more of mobile web browsing and searching is done by iOS users (iPhone, iPod Touch, iPad). In some places it’s much higher. Now, past performance is not necessarily a guide to the future (you only have to look at the graph to see that). But you have to ask too: what exactly is the “network effect” that Blodget thinks Google will get from Android? People writing apps? They already do; but it hasn’t dented the bigger platforms.
The interesting challenge will be for Nokia and RIM, which have to establish themselves at the higher end of the market as everything shifts to smartphones. But in a growing market, the only problem is how to supply enough people. Android’s a whopping success. But that doesn’t shut anyone out – yet.
Ask most people to name a productivity suite and chances are they’ll say Microsoft Office, but they might also name one of the numerous competitors that have sprung up. None have completely displaced the Microsoft monolith, but they’ve made inroads.
Most of the competition has positioned itself as being better by being cheaper. SoftMaker Office has demonstrated you don’t always need to pay Microsoft’s prices to get some of the same quality, while OpenOffice.org proved you might not need to pay anything at all. Meanwhile, services like Google Docs are available for anyone with an Internet connection.
Microsoft’s response has been to issue the newest version of Office (2010) in three retail editions with slightly less ornery pricing than before, as well as a free, ad-supported version (Microsoft Office Starter Edition) that comes preloaded on new PCs. Despite the budget-friendly competition, Office continues to sell, with Microsoft claiming back in January that one copy of Office 2010 is sold somewhere in the world every second. (Full disclosure: The author of this review recently bought a copy for his own use.)
How well do the alternatives shape up? And how practical is it to switch to them when you have an existing array of documents created in Microsoft Office? Those are the questions I had in mind when I sat down with both the new version of Microsoft Office and several other programs (and one cloud service) that have been positioned as low- or no-cost replacements.
Microsoft Office 2010
Despite all efforts to dethrone it, Microsoft Office remains the de facto standard for word processing, spreadsheets, presentations, and to a high degree, corporate email. Other programs may have individual features that are better implemented, but Microsoft has made the whole package work together, both across the different programs in the suite and in Windows itself, with increasing care and attention in each revision.
If you avoided Office 2007 because of the radical changes to the interface — namely, the ribbon that replaced the conventional icon toolbars — three years’ time might change your mind. First, the ribbon’s no longer confined to Office only; it shows up in many other programs and isn’t as alien as before. Second, Microsoft addressed one major complaint about the ribbon — that it wasn’t customizable — and made it possible in Office 2010 for end-users to organize the ribbon as freely as they did their legacy toolbars. I’m irked Microsoft didn’t make this possible with the ribbon from the start, but at least it’s there now.
Finally, the ribbon is now implemented consistently in Office 2010. Whereas Outlook 2007 displayed the ribbon only when editing messages, Outlook 2010 uses the ribbon throughout. (The rest of Outlook has also been streamlined a great deal; the thicket of settings and submenus has been pruned down a bit and made easier to traverse.) One feature that would be hugely useful is a type-to-find function for the ribbon; there is an add-in that accomplishes this, but having it as a native feature would be great.
Aside from the interface changes, Office 2007’s other biggest alteration was a new XML-based document format. Office 2010 keeps the new format but expands backward- and cross-compatibility, as well as native handling of OpenDocument Format (ODF) documents — the .odt, .ods, and .odp formats used by OpenOffice.org. When you open a legacy Word .doc or .rtf file, for instance, the legend “[Compatibility Mode]” appears in the window title. This means any functions not native to that document format are disabled, so the edited document can still be reopened without problems in earlier versions of Office.
Note that ODF documents don’t trigger compatibility mode, since Microsoft claims a high degree of compatibility between the two formats. The problem is that “high degree” doesn’t mean perfect. If you highlight a passage in an ODF document while in Word 2010, OpenOffice.org and LibreOffice recognize the highlighting. But if you highlight in OpenOffice.org or LibreOffice, Word 2010 interprets the highlighting as merely a background color assignment for the selected text.
Exporting to HTML is, sadly, still messy; Word has never been good at exporting simple HTML that preserves only basic markup. Also, exporting to PDF is available natively, but the range of options in Word’s PDF export module is very narrow compared to that of OpenOffice.org.
Many other little changes throughout Office 2010 ease daily work. I particularly like the way the “find” function works in Word now, where all the results in a given document are shown in a navigation pane. This makes it far easier to find that one occurrence of a phrase you’re looking for. Excel has some nifty new ways to represent and manipulate data: Sparklines, little in-cell charts that usefully display at-a-glance visualizations of data; and data slicers, multiple-choice selectors that help widen or narrow the scope of the data you’re looking at. PowerPoint lets you broadcast a presentation across the Web (via Microsoft’s PowerPoint Broadcast Service, the use of which comes free with a PowerPoint license) or save a presentation as a video.
One last feature is worth mentioning as a possible future direction for all products in this vein. Office users who also have a SharePoint server can now collaborate in real time on Word, PowerPoint, or Excel documents. Unfortunately, SharePoint is way out of the reach of most casual users. But given how many professional-level features in software generally have percolated down to the end-user level, I wouldn’t be surprised if Microsoft eventually adds real-time collaboration, perhaps through Windows Live Mesh, as a standard feature.
Microsoft Office 2010 takes on all comers: OpenOffice.org 3.3.0
OpenOffice.org has long been a commonly suggested replacement for Microsoft Office. It offers several common office-suite features at a much lower price — free — than Microsoft Office itself, although many of those individual features don’t have the level of polish or advancement found in commercial office-suite products. That said, for people who don’t need the absolute latest and greatest functionality in every category, OpenOffice.org is a solid piece of software. (In the interest of full disclosure, again, I admit I have been frustrated by its limitations, but I can recognize that for many other people it will more than do the job.)
Don’t be thrown off if you come to OpenOffice.org from the Microsoft Office side. The program’s UI is very vintage 2003 — dockable toolbars instead of the newer ribbon/tab metaphors that are now all the rage. That said, future versions of OpenOffice.org may sport a more modern look, although this is still very much under wraps — nothing more than mock-up designs of such a UI have surfaced yet.
Is Microsoft using a next-generation computing boot-loading technology to lock out the use of Linux and other OSes on certain computers? While Microsoft has denied malicious intent, one Red Hat developer maintains that this may be the case.
Microsoft is mandating the use of the UEFI (Unified Extensible Firmware Interface) secure boot-loading capability with Windows 8 in such a way that “the end user is no longer in control of their PC,” charged Red Hat developer Matthew Garrett in a blog entry posted Friday.
Microsoft has claimed that this charge is based on a misunderstanding of the company’s intentions. “At the end of the day, the customer is in control of their PC,” said Microsoft program manager Tony Mangefeste in another blog posting from Microsoft.
The controversy took root on Tuesday, when Garrett pointed out in a blog posting that Microsoft-certified computers running Windows 8 may not be able to be loaded with copies of other OSes, such as Linux. Users could not install Linux as a second OS, or replace Windows with a copy of Linux, Garrett argued.
Windows 8 will require its host computer to use the UEFI, the low-level interface between the computer firmware and the OS. Marketed as a replacement to BIOS, UEFI provides a secure boot protocol, which requires the OS to furnish a digital key in order to be loaded by the machine. UEFI then can block the operations of any programs or drivers unless they have been signed by this key, a move that should prevent malware from infecting machines by changing the boot-loading process.
With Windows 8, Microsoft will require hardware manufacturers (those wishing to display the Windows logo on their units) to ship their machines with secure boot enabled. Each machine would then require a digital key from Microsoft, the hardware manufacturer or, if it uses another OS, a secure key for that OS.
Users who customize their own versions of Linux, or use a generic OS that does not come with a key, may not be able to run those OSes on machines requiring this secure booting process, Garrett said. Nor is there any guarantee that OEMs (original equipment manufacturers) will even provide the ability for users to add their own keys, or give users the option to run other OSes without a key.
Garrett’s blog post subsequently sparked debate in the trade press and Linux user communities.
Responding to the controversy on Thursday, Microsoft denied that the intent was to shut out Linux. Although he did not mention Linux by name, Steven Sinofsky, president of the Windows and Windows Live Division, noted in a blog post that some of those commenting have used details of the new plan to “synthesize scenarios that are not the case.”
The rest of the posting, authored by Mangefeste, noted that Microsoft is concerned only that Windows 8 be protected in a secure boot loader, and that OEMs are free to build in the option of disabling secure boot for running OSes without keys. Other OS providers are responsible for providing their own keys.
“For the enthusiast who wants to run older operating systems, the option is there to allow you to make that decision,” Mangefeste wrote. “However, [disabling secure boot] comes at your own risk,” he added.
“Microsoft’s rebuttal is entirely factually accurate. But it’s also misleading,” Garrett responded in a follow-up blog item, posted Friday. Under the licensing agreement, the equipment manufacturer is under no obligation to provide users with the ability to disable the secure boot capability. Beyond the use of third-party OSes, this approach might also hamper the ability of users to upgrade components such as graphics cards, because there is no requirement to provide the user with the capability of installing additional keys.
“The truth is that UEFI secure boot is a valuable and worthwhile feature that Microsoft are misusing to gain tighter control over the market,” Garrett charged.
Proving naysayers incorrect once again, Microsoft posted a banner fiscal 2011 year in revenue, as sales of Microsoft Office, server software and Xbox continued to drive growth, even as sales of Windows leveled off, according to the company.
For the fiscal year ending June 30, Microsoft generated US$69.94 billion in revenue, an all-time high for the company and a 12 percent increase in revenue compared to fiscal 2010. Of this revenue, $23.15 billion was net income, a 23 percent increase from the prior year.
For the fourth quarter of fiscal 2011, Microsoft reported revenue of $17.37 billion, an 8 percent increase from the same period a year prior. Net income was $5.87 billion, a 30 percent increase.
Microsoft Business Division’s revenue grew by 16 percent for the year and 7 percent for the quarter, thanks to the recent launch of Microsoft Office 2010, which has already sold 100 million licenses. The division reported $5.8 billion in revenue for the fourth quarter and $22 billion for the full year, eclipsing revenue of Microsoft’s flagship Windows and Windows Live Division, which oversees the Windows operating system.
Windows and Windows Live Division actually declined by 2 percent for the year, and 1 percent for the quarter, as demand for personal computers stagnated during this period of time. This division posted $4.74 billion in revenue for the fourth quarter and $19.02 billion for the year.
Revenue from Microsoft’s Server and Tools division grew by 11 percent for the full year and 12 percent for the fourth quarter, as increased demand for Windows Server, System Center, and SQL Server continued unabated. Server and Tools reported $4.6 billion for the fourth quarter and $17 billion for the year.
The Entertainment and Devices Division posted the largest revenue growth for the company, swelling sales by 45 percent for the full year and 30 percent for the fourth quarter, thanks to sales of the Xbox game console and associated games and services. This division generated $1.5 billion in revenue for the quarter and $8.9 billion for the year.
For the fiscal year 2010 overall, the company reported revenue of $62.48 billion, and net income of $18.76 billion. That year, Microsoft recorded fourth quarter revenue of $16.04 billion and net income was $4.52 billion.
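The growth percentages quoted above can be sanity-checked against the reported totals. A quick Python check (figures in billions of US dollars, taken straight from the article):

```python
def growth(current, prior):
    """Year-over-year growth, rounded to the nearest whole percent."""
    return round((current - prior) / prior * 100)

print(growth(69.94, 62.48))  # 12 -> full-year revenue
print(growth(23.15, 18.76))  # 23 -> full-year net income
print(growth(17.37, 16.04))  # 8  -> Q4 revenue
print(growth(5.87, 4.52))    # 30 -> Q4 net income
```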
With these earnings, Microsoft bested analyst estimates across the board. Analysts expected the company to generate $17.23 billion in revenue for the quarter and $61.72 billion for the fiscal year, according to a poll by Thomson One Analytics. Net income was expected to come in at $4.9 billion for the fourth quarter and $22 billion for the year.
Mozilla today said that income from its search partners, including rival browser maker Google, increased by 19% last year.
Royalties, almost all of which come from search services like Google, Microsoft, Yahoo and others, were $121.1 million, up 19.3% from 2009’s $101.5 million.
The vast bulk of the Mozilla Foundation’s revenues came from search providers, which paid the organization for leading Firefox users to their websites. In 2010, royalty payments accounted for 98% of the year’s revenues, a percentage point higher than the share of Mozilla’s income attributed to search in the two years before.
Mozilla Foundation is the not-for-profit organization that oversees Mozilla Corp., the commercial firm that develops Firefox.
According to the audited financial statement (download PDF) released Monday, total revenues for 2010 were $123.2 million, up 18.1% from 2009’s $104.3 million.
Revenue growth last year was just over half that of the 34% increase Mozilla touted for 2009.
This was the second annual report in a row that Mozilla did not disclose the individual amounts it received from its search partners.
Instead, in a FAQ tied to the report, Mozilla repeated nearly word-for-word a line it used last year: “The majority of Mozilla’s revenue continues to be generated from the search functionality included in our Firefox product through all major search partners including Google, Bing, Yahoo, Yandex, Amazon, eBay and others.”
Historically, Google has accounted for most of Mozilla’s search royalties; in 2008, Google’s payments made up 88% of the total. That reliance on a rival often raises questions about Mozilla’s income stability. Google, after all, creates Chrome, the browser that is gaining on Firefox, which currently has the No. 2 spot behind Microsoft’s Internet Explorer (IE).
Mozilla’s contract with Google expires next month, a fact the organization acknowledged without specifically predicting that the deal would be renewed.
“Our largest contract, with Google, comes up for renewal in November,” said Mozilla in the FAQ. “We have every confidence that search partnerships will remain a solid generator of revenue for Mozilla for the foreseeable future.”
But it’s unlikely that Google would ditch the deal, said Al Hilwa, an analyst with IDC.
“These contracts are based on access to users who might be exposed to Google through the browser … [and] Firefox still has a significant share that is worthwhile to pay for,” said Hilwa. “I don’t see the value for Google to abandon that crowd because doing so is not going to help Chrome gain any more traction.”
According to a pair of Web analytics firms, Chrome will oust Firefox as the second-most-used browser sometime between the end of this year and the middle of 2012.
U.S.-based Net Applications, for example, said Chrome’s share of all browsers used in September was 16.2%, while Firefox’s was 22.5%.
Mozilla also noted its 2011 revenues included some “very important individual and corporate donations.”
The organization’s 2010 tax return (download PDF) listed several such donations, including one for $735,000 and another for $175,000. Mozilla did not identify the individuals or companies that donated the money, however.
Microsoft will make no more Zune music players, building its future music strategy on applications incorporated in its Windows Phone and Xbox platforms, the company has confirmed.
Rumors circulated in March that Microsoft planned to stop making dedicated music players, but the company ducked the issue then, saying this year’s new Zune devices would be mobile phones running Zune software. It went on to release a trickle of applications for the Zune platform later in the year.
The company has now updated a help page at Zune.net to announce: “We will no longer be producing Zune players.” Instead, the page said: “Going forward, Windows Phone will be the focus of our mobile music and video strategy.”
However, this will make no difference to current Zune users, the company said on the support page: “Your device will continue to work with Zune services just as it does today. And we will continue to honor the warranties of all devices for both current owners and those who buy our very last devices.”
Microsoft launched the first Zune players and Zune Marketplace music store on Nov. 14, 2006, as a challenge to Apple’s iPod player and iTunes music store.
However, Apple moved the goalposts a couple of months later with its Jan. 9, 2007, announcement of the iPhone, a widescreen iPod that could also make phone calls and surf the Web.
It took Microsoft three years to follow suit. When it announced Windows Phone in February 2010, one of the features of the new mobile OS was a Zune music player app. It’s also possible to access Zune music and video via Xbox Live, Microsoft’s online service for its Xbox 360 game console.
Sales of the Zune have consistently trailed far behind those of the iPod. While not a definitive ranking, the list of best-selling MP3 players at Amazon.com is telling: Nine of the 10 best sellers are iPods (SanDisk has a $40 Sansa model in eighth place), and the first Zune device appears at number 24, preceded by 16 iPod variants.
Guest post: In the midst of staggering business and economic turmoil, you need all the tools at your disposal to help shore up your career prospects. See why TechRepublic’s Erik Eckel believes IT certification will become increasingly important for IT pros and the organizations that rely on them. You can find more posts like this on TechRepublic’s 10 Things blog.
Many technology professionals believe IT certifications reached their peak during the height of the dot-com boom. But such a mindset may well prove shortsighted. The subsequent dot-com bomb led to an exodus of certified technicians from the industry. Then, as the dust settled, IT certifications were reworked. Accreditations were better mapped to real-world needs and expertise. Program flaws were eliminated. Training programs improved.
Now, in turbulent economic times, IT certifications will provide more relevance than ever before. With unprecedented bailouts, widespread cost and workforce reductions, and a slew of new platforms being released, IT accreditations will assume renewed importance in 2009. Here are 10 reasons why IT certifications will prove important in 2009.
#1: Job retention
Organizations are laying off employees at alarming rates. When even a wildly successful enterprise like the National Football League cuts 10 percent of its staff, you know the economy’s in trouble.
When faced with difficult personnel decisions, organizations generally try to retain the most skillful and knowledgeable employees. Certified IT pros have a credible advantage over their colleagues. While holding a current IT accreditation is no guarantee against being laid off, the more education, expertise, and skills you can demonstrate, the better.
#2: Salary maintenance
Many organizations — and city and state governments in particular — are asking employees to accept salary reductions. Whereas staff may have grumbled over a scant four or five percent salary increase a few years ago, today many are being asked to cut their compensation by those amounts.
Holding a current IT certification does not guarantee you won’t face salary reductions. But possessing specific certifications — including A+, Security+, Microsoft credentials, and other accreditations — often qualifies employees for higher pay grades. Thus, if you are forced to accept a salary reduction, you’re more likely to be earning more than your non-certified colleagues.
#3: Hiring and promotion eligibility
Despite the economic downturn, some companies are still hiring. Others are actively promoting from within. Recent headlines show that medical facilities, health insurance companies, and manufacturers, among others, are continuing their expansion efforts.
Significant competition exists for these open positions. With unemployment exceeding six percent, a number expected to grow in 2009, jockeying for good jobs will only increase. If your resume is bolstered by new and timely certifications, you’ll gain an advantage over others applying for the same role. For better or worse, in cases where two otherwise equal candidates are competing for the same lucrative job offer, one applicant’s certifications could prove the deciding factor. Certification may even be required to apply for the position.
#4: Career improvement
Many technology professionals feel they’ve done all they can do as a support technician or network administrator. They may be working in positions where they’ll receive no additional responsibilities, pay, or challenges unless they move up the corporate ladder.
IT certifications can certainly open the door to such promotions. By completing project management training and proving command of the fundamentals by earning a Project Management Institute (PMI) Project Management Professional or CompTIA Project+ certification, an administrator can demonstrate initiative and expertise in an effort to win a project management promotion. Likewise, a support tech might leverage a Microsoft Certified IT Professional (MCITP) accreditation to gain a new position as a server administrator.
#5: New-generation certs increase relevance
Certifications are receiving a boost from considerable reworking. Many organizations, including CompTIA and Cisco, are revamping and redesigning exams and instructional initiatives. And Microsoft really stands out due to the variety and impact of changes made to its training and certification program.
Microsoft’s new generation of certifications — including the new MCITP, Microsoft Certified Technology Specialist (MCTS), and Microsoft Certified Professional Developer (MCPD) accreditations — map directly to real-world needs. The MCTS, for example, measures a candidate’s skill, knowledge, and expertise deploying, maintaining, and administering specific Microsoft platforms.
Microsoft’s new MCITP credential, meanwhile, is aimed at helping organizations meet specific staffing needs. The certification is designed to demonstrate expertise within job roles, such as server administrator or desktop support technician, thereby better enabling hiring managers to spot qualified, well-targeted candidates.
To keep these new-generation certifications relevant, Microsoft is retiring credentials when mainstream support ends for the corresponding technology platforms. Those changes, combined with the introduction of classroom and lab training requirements for new higher-level certifications, are helping put the shine back on IT certifications in 2009.
#6: Organizations will become more discriminating
Consultants can benefit from IT certifications in 2009, too. As clients more closely guard expenses and become more discerning, organizations needing to outsource computer services and support will want to ensure the firms and technicians they hire are competent. IT certifications are a great method for consultants to demonstrate their skill, knowledge, and expertise to potential clients.
#7: New products will gain momentum
A slew of new products is sure to gain momentum in 2009. Microsoft’s 2008-branded server products lead the charge of new technologies that will gain market share throughout the year.
As organizations begin replacing older or failed equipment with these new products, and as myriad other factors require that the new platforms be deployed, these organizations will seek qualified IT technicians, managers, and consultants to plan, deploy, and administer the upgrades. If you can demonstrate your skills and expertise with these platforms, you’ll be better positioned to provide those services. By becoming certified on new technologies that gain traction in 2009, you’ll strengthen your resume and align your expertise with the products organizations are actually deploying.
#8: Organizations must minimize downtime
Server, desktop, and network downtime, as well as mean times to repair, must improve. This is true for almost every organization, but especially for those that have reduced staff, as fewer employees are available to pick up the slack when errors or failures occur.
When running lean, as many companies have been forced to do, remaining employees’ workloads are often increased. Thus, it’s imperative that organizations fully utilize remaining staff.
IT certification programs are one method of ensuring that staff members have the training and instruction required to fulfill specific responsibilities. Employees who are better trained and educated as the result of certification efforts will be less likely to commit errors that lead to failure. And when outages do occur, the corresponding education and training will prove helpful in speeding recovery.
#9: Organizations need to reduce costs
When sales or funding levels dive, reducing operating costs becomes critical. During periods of recession, organizations are obligated to maximize efficiency. As a result, productivity requirements become greater for each worker.
From a cold and calculating perspective, IT certification is one proven method for maximizing the return on an organization’s salary expense. By using training and certification programs to ensure that technicians have specific skills, whether in desktop support or network design and optimization, organizations get more productivity from every salary dollar.
A Kotler Marketing Group study published by CompTIA revealed certifications enabled organizations to reduce expenses, identify knowledge gaps, and improve productivity. In addition, certifications proved helpful in improving uptime and reducing turnover.
#10: Confidence proves handy during turbulent times
If nothing else, during periods of stress and upheaval, it helps to have confidence. While you can’t insulate yourself from major economic trends, certifications offer assurance that you’ve taken prudent steps to keep your skills current and to remain an attractive employee, both to your current employer and to prospective hiring managers, should a pink slip arrive.
Some 75 percent of IT professionals responding to the Kotler study said their CompTIA certifications make them more attractive to employers, while 84 percent believe they now have the skills necessary to fulfill a job’s requirements. Further, some 93 percent agreed or strongly agreed that, thanks in part to their certifications, customers felt they were in good hands when working with them.
With numerous other factors seemingly out of your control, IT certifications present at least one element you can command. In an age of unprecedented business and economic turmoil, the resulting confidence boost can only help.
Here’s a list of the 10 accreditations with the greatest potential for technology support professionals, administrators, and managers seeking employment within consulting firms or small and midsize organizations.
By Erik Eckel
Just as with many popular rivalries — Red Sox v. Yankees, Chelsea v. Manchester United, Ford v. Chevy — IT certifications are popular fodder for debate. Except that, in an IT professional’s world, certifications have a much bigger impact on the future. Just which certifications hold the most value today? Here’s my list.
This best certification list could be built using 10 Microsoft certifications, many of which would be MCITP accreditations. The world runs on Microsoft. Those professionals earning Microsoft Certified IT Professional (MCITP) certification give employers and clients confidence that they’ve developed the knowledge and skills necessary to plan, deploy, support, maintain, and optimize Windows technologies. Specifically, the Enterprise Desktop Administrator 7 and Server Administrator tracks hold great appeal, as will Enterprise Messaging Administrator 2010, as older Exchange servers are retired in favor of the newer platform.
With operating systems (Windows 2000, 2003, 2008, etc.) cycling through every several years, many IT professionals simply aren’t going to invest the effort to earn MCITP or MCSE accreditation on every version. That’s understandable. But mastering a single exam, especially when available examinations help IT pros demonstrate expertise with such popular platforms as Windows Server 2008, Windows 7, and Microsoft SQL Server 2008, is more than reasonable. That’s why the Microsoft Certified Technology Specialist (MCTS) accreditation earns a spot on the list; it provides the opportunity for IT pros to demonstrate expertise on a specific technology that an organization may require right here, right now.
There’s simply no denying that IT professionals must know and understand the network principles and concepts that power everything within an organization’s IT infrastructure, whether running Windows, Linux, Apple, or other technologies. Instead of dismissing CompTIA’s Network+ as a baseline accreditation, every IT professional should add it to their resume.
Just as with CompTIA’s Network+ certification, the A+ accreditation is another cert that all IT professionals should have on their resume. Proving baseline knowledge and expertise with the hardware components that power today’s computers should be required of all technicians. I’m amazed at the number of smart, seasoned IT pros who aren’t sure how to crack the case of a Sony Vaio or diagnose failed capacitors with a simple glance. The more industry staff can learn about the fundamental hardware components, the better.
SonicWALLs power countless SMB VPNs. The company’s network devices also provide firewall and routing services, while extending gateway and perimeter security protections to organizations of all sizes. By gaining Certified SonicWALL Security Administrator (CSSA) certification, engineers can demonstrate their mastery of network security essentials, secure remote access, or secure wireless administration. There’s an immediate need for engineers with the knowledge and expertise required to configure and troubleshoot SonicWALL devices providing security services.
Although SonicWALL has eaten some of Cisco’s lunch, the demand for Cisco skills remains strong. Adding Cisco Certified Network Associate (CCNA) expertise to your resume does no harm and helps convince larger organizations, in particular, that you have the knowledge and skills necessary to deploy and troubleshoot Cisco routing and switching hardware.
Here’s where the debate really begins. Increasingly, my office is being asked to deploy and administer Mac OS X networks. In the real world, divorced from IT-industry rhetoric, we’re being asked to replace older Windows networks with Mac OS X client-server environments. We’re particularly seeing Apple traction within nonprofit environments. We’ve found the best bet is to get up to speed on the technologies clients are requesting, so it stands to reason that earning Apple Certified Technical Coordinator (ACTC) 10.6 accreditation won’t hurt. In fact, developing mastery over Mac OS X Snow Leopard Server will help provide confidence needed to actually begin pursuing Apple projects, instead of reactively responding to client requests to deploy and maintain Apple infrastructure.
Apple Certified Support Professional (ACSP) 10.6 accreditation helps IT professionals demonstrate expertise supporting Mac OS X client workstations. If you work for a single organization, and that firm doesn’t use Macs, you won’t need this certification. But larger organizations adding Macs due to demand within different departments or consultants working with a wide client base will do well to ensure they have Snow Leopard client skills. The ACSP is the perfect way to prove mastery.
Unchanged from the last 10 best certifications list, ISC2’s security accreditation for industry professionals with at least five years of full-time experience is internationally recognized for its value and validity. The Certified Information Systems Security Professional (CISSP) title demonstrates expertise with operations and network security, subjects that will only increase in importance as legal compliance, privacy, and risk mitigation continue commanding larger organizations’ attention.
I fear organizations begin cutting project managers first when times get tough. Management roles and responsibilities often get passed to technical staff when layoffs occur. Even in challenging economic times, though, IT departments require staff familiar with planning, scheduling, budgeting, and project management. That’s why the Project Management Institute’s (PMI) Project Management Professional (PMP) accreditation makes the list. The cert measures candidates’ expertise in managing and planning projects, budgeting expenses, and keeping initiatives on track. While there’s an argument to place CompTIA’s Project+ certification in this slot, PMI is a respected organization that exists solely to further professional project management and, as such, deserves the nod.
Honorable mentions: MCSE, ITIL, RHCE, Linux+, VCP, ACE, QuickBooks, Security+
In the previous version of this article, readers asked where NetWare certification stands. It’s not on the list. That’s not a mistake. It’s gone the way of BNC connectors, in my opinion. Microsoft owns the market. MCSEs have more value.
ITIL has its place, particularly in larger environments. RHCE (or Linux+) and VCP certainly have roles within enterprises dependent upon Red Hat/Linux and VMware virtualization technologies, but demand from those organizations remains hit or miss.
Acronis’ ACE deserves a look. With some 3 million systems now backed up by Acronis imaging software, it would behoove technology professionals to learn how to use it properly. I think it’s fair to say many still underestimate the software’s potential.
SMBs are also demonstrating a surge of interest in QuickBooks technologies. From QuickBooks Point-of-Sale to QuickBooks Enterprise platforms, there’s strong, growing demand for QuickBooks expertise in the field. The company’s growth is impressive. There’s no other way to describe it. In a crappy economy, Intuit’s growing.
Security+, really, is a no-brainer, but I’ll get lit up if I include nothing but CompTIA certifications in the top 10 list. Still, my advice for anyone entering the industry, or even for veterans seeking their first accreditations, is to load up on CompTIA certs. How can you go wrong with vendor-independent certifications that demonstrate mastery of fundamentals across a range of topics, including project management, hardware, networking, security, and voice networks? You could do much worse.
A word on the methodology
There’s no double-blind statistically valid data analysis run through a Bayesian probability calculus formula here. I’ve worked in IT long enough, however, and with enough different SMBs, to know what skills we need when the firm I co-own hires engineers and sends technicians onsite to deploy new systems or troubleshoot issues.
Sure, I could have thrown in ITIL to satisfy enterprise professionals, included RHCE to sate the rabid open source crowd, and added VCP to look hip by giving the list a virtualization element. But I’m just not seeing the demand for those skills in companies with up to several hundred employees. My firm has been asked to deploy exactly one Linux server in almost seven years, and we’ve virtualized maybe a dozen systems. It would therefore be a disservice to readers to include such accreditations when I see, on a daily basis, vastly greater demand for the other skill sets listed here.
Erik Eckel is president of two privately held technology consulting companies. He previously served as executive editor at TechRepublic. Read his full bio and profile.