There's been a slight speedbump in the steady drive by hyperscale service providers toward ever more compute, storage and networking gear in their datacenters.
The hyperscale service providers are companies like Amazon Web Services (AWS), Microsoft and Alphabet, each of which is building and expanding dozens of datacenters worldwide to provide public cloud services to business and government customers.
As those companies and a few other huge competitors have physically positioned themselves to dominate the public cloud service market, their massive purchases have become an increasingly large portion of overall IT infrastructure spending.
IDC this week released figures for the second quarter of 2019 (April-June) showing that vendor revenue from hardware infrastructure sales to public cloud environments fell 15 percent from the year-ago quarter, to $9.4 billion.
As the Framingham, Mass.-based research company's statement put it: "This segment of the market continues to be highly impacted by demand from a handful of hyperscale service providers, whose spending on IT infrastructure tends to have visible up and down swings."
In all, sales of IT infrastructure products for cloud environments dropped by 10 percent year-over-year when private cloud spending is included. Private cloud spending also declined, but at a much milder pace of about 1 percent.
The market researchers had anticipated declines this year after a free-spending 2018. In the first-quarter report back in June, IDC's Natalya Yezhkova cautioned against reading too much into the expected drop-off.
"As the overall IT infrastructure goes through a period of slowdown after an outstanding 2018, the important trends might look somewhat distorted in the short term," Yezhkova, research vice president of Infrastructure Systems, Platforms and Technologies at IDC, said in a statement. "IDC's long-term expectations strongly back continuous growth of cloud IT infrastructure environments. With vendors and service providers finding new ways of delivering cloud services, including from IT infrastructure deployed at customer premises, end users have fewer obstacles and pain points in adopting cloud/services-based IT."
With its September update, which includes Q2 results, IDC is now calling for the public cloud IT infrastructure segment to drop by nearly 7 percent from 2018 to about $42 billion in vendor revenues for all of 2019.
The pullback in spending is delaying the crossover point when more money will be spent on cloud infrastructure (both public cloud and private cloud) than on traditional IT environments. While cloud IT infrastructure spending was already briefly higher than traditional IT infrastructure spending in the third quarter of 2018, IDC now forecasts that for the full year of 2019, slightly more money will be spent on traditional IT infrastructure than cloud IT infrastructure. As for the public/private cloud mix, about two-thirds of cloud IT infrastructure spending comes from the hyperscale, public cloud providers.
This speedbump aside, IDC anticipates steady growth in spending on cloud IT infrastructure in 2020 and beyond -- leading to the cloud side outpacing the traditional side in revenues on a sustained basis.
Posted by Scott Bekker on 09/26/2019 at 10:59 AM
Kirill Tatarinov, former president and CEO of Citrix and a longtime senior executive at Microsoft, is taking on a senior executive role at data protection specialist Acronis.
Tatarinov this month was named executive vice chairman and will report to Founder and CEO Serguei Beloussov. Tatarinov has been a member of the Acronis board for the last 10 months.
"Kirill will be an active member of the executive team, helping to accelerate Acronis' cyber strategy execution and scale the Acronis Cyber Platform and Acronis Cyber Infrastructure business. His experience with innovations at scale will help us to deliver easy, efficient and secure cyber protection to customers of any size," Beloussov said in a statement announcing Tatarinov's new role.
Tatarinov was president and CEO at Citrix for a year and a half from early 2016 to mid-2017. His tenure at Microsoft spanned 13 years and included stints as president of the Microsoft Business Solutions Division and corporate vice president of the Management and Solutions Division.
Posted by Scott Bekker on 09/16/2019 at 9:33 AM
When it comes to software and services, Microsoft has always tried to offer it all, or at least as much of it as possible.
That always makes it interesting when the company acknowledges the use of a major third-party product for internal purposes in its Fortune 100-class operations.
Microsoft will be pulling back the curtain later this month on how it uses Red Hat Ansible Automation. Launched in 2013, Red Hat Ansible Automation is a tool for automation across the stack from infrastructure to networks to cloud to security for both IT operations and development.
Microsoft will be one of several Red Hat customers speaking at AnsibleFest Atlanta from Sept. 24 to 26. Other customers talking about their Ansible adoption at the show include Datacom, Energy Market Company and Surescripts.
"Adopting Red Hat Ansible Automation has not only changed how our networks are managed, but also sparked a cultural transformation within our organization," said Bart Dworak, Microsoft's software engineering manager for Network Infrastructure and Operations, in a statement. "By putting automation at the forefront of our strategy and not as an afterthought, we've been able to scale it in ways we did not know possible. Our engineers are now constantly looking for creative ways to solve their problems using Ansible Playbooks."
Microsoft turned to Ansible to improve the productivity of hundreds of engineers across 600 locations worldwide. Those engineers use Ansible for designing, building and deploying IT networks at scale, and the use of Ansible Automation has saved an estimated 3,000 work hours per year and reduced downtime.
For Microsoft, the Ansible deployment has a dogfooding element and AnsibleFest Atlanta will be an opportunity to drum up more partnership business with joint Red Hat-Microsoft customers. Microsoft's deployment of Red Hat Ansible Automation was done on top of Microsoft Azure.
Posted by Scott Bekker on 09/16/2019 at 7:47 AM
The founder of the company that created the Wunderlist app has a new item on his public to-do list: persuade Microsoft to sell him back the app.
Christian Reber was the founder and CEO of 6Wunderkinder and sold the company to Microsoft in 2015 for an estimated $100 million to $200 million.
"Still sad @Microsoft wants to shut down @Wunderlist, even though people still love and use it. I'm serious @satyanadella @marcusash, please let me buy it back. Keep the team and focus on @MicrosoftToDo, and no one will be angry for not shutting down @Wunderlist," Reber tweeted on Friday from his @christianreber account to the official accounts of Microsoft CEO Satya Nadella and Marcus Ash, Microsoft's general manager for tasks in Berlin.
At the time of the June 2015 sale, the Berlin-based company claimed about 13 million users for the Wunderlist Pro and Wunderlist for Business software products.
Asked in a Twitter reply whether he was serious about proposing a business deal in a tweet rather than through direct communication with Microsoft executives, Reber replied, "Serious offer."
As of Monday, the original tweet had 575 retweets and 2,400 likes.
The writing was on the wall for Wunderlist in April 2017, when Microsoft released a preview of Microsoft To-Do, a new Office 365 application that would eventually incorporate Wunderlist capabilities. At the time, Microsoft indicated that its ultimate aim was to retire the Wunderlist app.
It's unclear what Microsoft's actual timeframe is for discontinuing Wunderlist. In a tweet (in German) last year after leaving Microsoft, Reber revealed that Microsoft was having technical difficulties in porting Wunderlist's back end from Amazon Web Services to Microsoft Azure.
In any event, Microsoft can now add a Twitter-based business offer to its list of tasks related to closing down Wunderlist.
Posted by Scott Bekker on 09/09/2019 at 9:27 PM
During his Monday keynote at VMworld 2019 in San Francisco, VMware CEO Pat Gelsinger provided a roadmap update for his company's partnership with Microsoft.
The two companies' joint offering, Azure VMware Solutions, enables VMware workloads to run natively on Microsoft Azure. As described by Microsoft at the late-April unveiling, the agreement allows customers to run, manage and secure applications across VMware environments and Azure with a common operating framework. Supported VMware technologies include VMware vSphere, vSAN, NSX and vCenter.
At launch, the offering was immediately available in two Azure regions -- U.S. East and U.S. West. (Microsoft defines "regions" as sets of datacenters "within a latency-defined perimeter and connected through a dedicated regional low-latency network.")
On Monday, Gelsinger indicated that the offering is now available in a third Azure region, the West Europe region, with further global expansion coming soon.
"By the end of the year, we'll be in Australia and Southeast Asia," Gelsinger said.
According to a graphic displayed behind Gelsinger, the footprint of the partnership will also broaden within some existing geographies before the end of this year. The offering should be up and running in Microsoft's Azure regions in Northern Europe, U.S. West 2 and South Central, which is also in the United States.
The graphic also showed that by the end of the first quarter of 2020, the offering will be added to Azure regions in Canada and Japan. That will bring availability of the VMware offering to 10 of Microsoft's 54 announced Azure regions.
The joint solution has attracted a few major customers. "Customers are already launching into this platform today. Over 20 customers are on it," said Gelsinger, naming Lucky Brands, Dot Foods and Gap.
Posted by Scott Bekker on 08/26/2019 at 12:56 PM
If three Florida municipalities getting hit by ransomware earlier this summer sounded bad, here's a Texas-sized problem for you.
The Texas Department of Information Resources (DIR) on Friday revealed that more than 20 entities, mostly smaller local governments in the state, were impacted by a ransomware attack.
"On the morning of August 16, 2019, more than 20 entities in Texas reported a ransomware attack," the Texas DIR said in an update Saturday evening that put the total number of affected agencies at 23. State government agencies were not among those affected.
The attacks seem to be coordinated. "At this time, the evidence gathered indicates the attacks came from one single threat actor. Investigations into the origin of this attack are ongoing; however, response and recovery are the priority at this time," the updated statement said.
Officials swung into action on Friday in a response that included, in addition to the DIR, the Texas Division of Emergency Management, the Texas Military Department, the Texas A&M University System's Security Operations Center/Critical Incident Response Team, the Texas Department of Public Safety, the Texas Public Utility Commission, the U.S. Department of Homeland Security, the Federal Bureau of Investigation, the Federal Emergency Management Agency, and other federal agencies.
The ransomware incidents in Texas follow a trio of incidents in Florida in Riviera Beach, Lake City and Key Biscayne. Two of those incidents involved huge ransomware payouts -- $600,000 for Riviera Beach and $460,000 for Lake City -- most of which was covered by insurance.
It is unclear whether cities are more heavily targeted for ransomware than other types of entities. On the one hand, small and local governments often have budget struggles that result in outdated IT infrastructure, and there are many documented cases of governments falling victim to attacks. On the other hand, it's easier for a company to conceal a ransomware attack. Government agencies are more accountable to public scrutiny and less able to choose to keep an incident quiet.
Posted by Scott Bekker on 08/20/2019 at 12:38 PM
To entice businesses still using Windows Server 2008 into migrating to its cloud, Microsoft is offering a big carrot: Extended Security Updates (ESU) plans at no cost.
Of course, with every carrot comes a stick. In this case, the stick is the impending end of support for Windows Server 2008 on Jan. 14, 2020. Specifically, Extended Support, which includes security updates, ends that day for Windows Server 2008 Service Pack 2, Windows Server 2008 R2 SP1, Hyper-V Server 2008 and Hyper-V Server 2008 R2 SP1.
It's a wide swath of the market that's still on those server operating systems. Microsoft recently estimated that 60 percent of its server installed base, or 24 million instances, remain on Windows Server 2008 and/or SQL Server 2008, which fell out of support last month.
The other part of the stick is that organizations that want to stay on Windows Server 2008 for some reason must enter into an expensive ESU contract if they want any kind of security protection, and those contracts are only available for three years.
The carrot is that Microsoft is offering another route for customers who don't want to, or are unable to, move off of Windows Server 2008 or SQL Server 2008 right away. The carrot is a sort of half-move.
What those customers can do is migrate their instances, as they are, to Azure. Customers who rehost Windows Server 2008 and 2008 R2 workloads directly to Azure will get three full years of ESU at no additional charge. That gives them the option of upgrading from Windows Server 2008 at a more leisurely pace within those virtual machines.
They'll be paying Azure hosting fees and be in the public cloud, but they don't have to pay the ESU, so their existing operations can continue largely as-is.
It's a serious move by Microsoft to make Azure very appealing to organizations that have been at the tail end of the cloud adoption curve.
While attractive, this is only one of the options for moving from Windows Server 2008 before the deadline. For more detail on options for on-premises, hybrid and cloud migrations, check out the "Partner's Guide to the Windows Server 2008 Deadline" (registration required) at our sister site RCPmag.com.
Posted by Scott Bekker on 08/08/2019 at 10:53 AM
In an effort to get more on-premises data warehouses onto its Azure cloud, Microsoft has launched a migration initiative in partnership with Informatica for select customers.
Announced Tuesday, the offer is designed to lower the expense and risk of a proof-of-value project to determine the feasibility and advantages of moving a data warehouse to Azure.
The offer includes tools to reveal the contours of a data estate, free code conversion of existing schemas, a SQL Data Warehouse subscription for up to 30 days and on-site help.
Informatica's end involves its Informatica Enterprise Data Catalog and Informatica Intelligent Cloud Services for up to 30 days of a proof-of-value project. Informatica, an enterprise software company with a portfolio of data integration tools among other products, went private in a 2015 deal that attracted an investment from Microsoft.
In a blog post, John Chirapurath, general manager of Azure Data & AI for Microsoft, said the new program is designed to address the perceived risks involved in making a data warehouse move.
"For customers that have been tuning analytics appliances for years, such as Teradata and Netezza, it can seem overwhelming to start the journey towards the cloud. Customers have invested valuable time, skills, and personnel to achieve optimal performance from their analytics systems, which contain the most sensitive and valuable data for their business," Chirapurath wrote.
Posted by Scott Bekker on 08/06/2019 at 11:02 AM
Microsoft posted revenue of $33.72 billion for its fiscal fourth quarter, a 12% gain, along with earnings per share of $1.37.
The company announced its latest quarterly earnings on Thursday. Its stock rose by more than 1% in after-hours trading on the results, which beat financial analysts' expectations. The earnings number was non-GAAP; the GAAP figure was higher due to a net income tax benefit of $2.6 billion for the quarter.
The most closely watched number for Wall Street this quarter was Microsoft's Azure growth metric. Microsoft reported that Azure was up by 64% compared to the year-ago period. Microsoft doesn't report Azure revenues, but the company's rate of growth has been slowing over the last few years as the company's total Azure revenues increase. For example, in the fourth quarter of 2018, Microsoft reported an Azure growth rate of 89%, and in the fourth quarter of 2017 it was 97%.
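A slowing growth rate on a growing base can still mean larger absolute gains each year. As a rough illustration (using an arbitrary index of 100 for the 2017 quarter, not actual revenue figures, since Microsoft does not disclose them):

```python
# Illustrative only: Microsoft does not report Azure revenue, so we index
# the fiscal 2017 fourth quarter at 100 and apply the reported
# year-over-year growth rates from the two subsequent fourth quarters.
rates = {"FY2018 Q4": 0.89, "FY2019 Q4": 0.64}

index = 100.0
for quarter, rate in rates.items():
    gain = index * rate  # absolute growth for the year, in index points
    index += gain
    print(f"{quarter}: +{rate:.0%} -> index {index:.0f} (gain {gain:.0f})")

# Even though 64% < 89%, the second year's absolute gain (121 points)
# exceeds the first year's (89 points) because the base has grown.
```

This is why a declining growth percentage does not necessarily mean Azure is adding less business each year.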
Combining Microsoft's commercial clouds by revenue did, in fact, yield a big number. "Q4 commercial cloud revenue increased 39% year-over-year to $11.0 billion, driving our strongest commercial quarter ever," Microsoft CFO Amy Hood said in a statement in Microsoft's earnings release.
For the full year, Microsoft revenues hit $125.8 billion, an increase of 14% over fiscal year 2018.
By business unit for the quarter, the Intelligent Cloud unit had the fastest growth, up 19% to $11.4 billion in revenues. That unit comprises server products and cloud services, which includes Azure, and Enterprise Services.
The Productivity and Business Processes unit grew 14% to $11 billion. That unit includes Office Commercial, Office Consumer, LinkedIn and Dynamics. Among the highlights for the unit were Office 365 Commercial revenue growth of 31%, an increase in Office 365 Consumer subscribers to 34.8 million, a 25% bump in LinkedIn revenues and a 45% gain in Dynamics 365 revenues.
The slowest-growing business unit was More Personal Computing, which reached $11.3 billion on 4% growth. Windows OEM revenue was a positive for the unit, with a 9% increase, and Surface revenues were up 14%. Gaming revenue, however, was a drag with a 10% drop.
Meanwhile, Microsoft CEO Satya Nadella took up a theme he's expressed in previous earnings calls this year -- that major customers are becoming more like partners, with roles as crucial to Microsoft as the historic OEM relationships.
Describing "deep partnerships with leading companies in every industry," Nadella said, "Every day we work alongside our customers to help them build their own digital capability -- innovating with them, creating new businesses with them, and earning their trust. This commitment to our customers' success is resulting in larger, multi-year commercial cloud agreements and growing momentum across every layer of our technology stack."
In a statement released just before Microsoft's earnings, John Dinsdale, chief analyst and research director for Synergy Research Group, called Microsoft the clear No. 2 (after Amazon Web Services) in cloud infrastructure services and a very clear market leader in the fragmented Software as a Service (SaaS) market.
For cloud infrastructure, Dinsdale noted, "[Microsoft's] revenue growth rate is way above the overall market growth rate, so it is gradually gaining market share -- 9% in 2016, 11% in 2017, 14% in 2018 and 16% in the first quarter of 2019."
Posted by Scott Bekker on 07/18/2019 at 3:00 PM
ServiceNow and Microsoft on Tuesday extended their strategic partnership in a move designed to appeal to governments and enterprises in highly regulated industries looking to move digital workflows to the cloud.
ServiceNow, based in Santa Clara, Calif., provides cloud-based platforms and solutions for delivering digital workflows. The new agreement builds on an alliance from October that allowed Microsoft's U.S. federal government customers to deploy ServiceNow technology from the Microsoft Azure Marketplace to the Azure Government Cloud.
The main component of the expanded arrangement is that ServiceNow will use Azure as a preferred, but not exclusive, cloud platform. The Azure version will include ServiceNow's "full SaaS experience," according to the announcement. Initial availability will be in Australia and Azure Government in the United States, with additional Azure regions coming later.
ServiceNow will still provide its SaaS offering on its own private cloud. The company also announced a deal in May with Google Cloud Platform (GCP) and has integrations with Amazon Web Services (AWS).
According to the announcement, ServiceNow will benefit from Azure's broad regulatory and compliance coverage, while ServiceNow's inroads with the U.S. federal government's digital transformation efforts could bring new workloads to Azure.
Microsoft and ServiceNow will also continue to partner on development of technology integration and user experience improvements for their joint customers.
In a separate transaction announced at the same time, Microsoft will use ServiceNow's IT and employee experience workflow products internally.
Posted by Scott Bekker on 07/09/2019 at 3:01 PM
About 10 years ago, a worm dubbed Conficker began constructing a botnet that grew to span more than 10 million Windows computers. Conficker is still estimated to have potential control over as many as 500,000 unpatched Windows systems, but it was never used for anything but a low-yield scareware campaign.
Over the weekend, journalist Mark Bowden provided an explanation for why that powerful botnet was abandoned. Bowden is best known for "Black Hawk Down," a book-length account of the U.S. military raid in Somalia in 1993 that was turned into a movie. In 2011, Bowden wrote a book about Conficker, called "Worm."
In an article published in The New York Times, "The Worm That Nearly Ate the Internet," Bowden provided an update on Conficker by reporting about an article from a classified journal he obtained this year.
"This explanation was detailed in an article published in December 2015 by The Journal of Sensitive Cyber Research and Engineering, a classified, peer-reviewed publication issued by a federal interagency cybersecurity working group including the Pentagon, Department of Homeland Security and N.S.A. -- and distributed to a small number of experts with the appropriate security clearances. The article itself was not classified, but reached only a small readership," Bowden wrote.
Contrary to theories that academics created Conficker as an exercise or a government developed it as a cyberweapon, Bowden contended that the journal article builds a strong case that it was the work of cybercriminals.
"While some experts still disagree, most now believe that Conficker was the work of Ukrainian cybercriminals building a platform for global theft who succeeded beyond all expectation, or desire," he wrote.
The scale of the botnet appeared to surprise its creators, which may explain why it went unused despite all the effort that went into building it. "The last thing a thief wants is to draw attention to himself," Bowden wrote. "Conficker's unprecedented growth drew the alarmed attention of cybersecurity experts worldwide. It became, simply, too hot to use."
The story is well worth a read for more detail on the Ukrainians and the Swede charged in the case, as well as policy and Internet security implications of Conficker a decade later.
Posted by Scott Bekker on 07/01/2019 at 3:01 PM
Azure Files premium tier has reached general availability (GA), giving users with higher performance needs the ability to access managed file services on solid-state drives in Microsoft's public cloud.
The GA announcement on Wednesday comes after the service had been available in a narrow preview since last September and a broader preview since early May.
"Premium tier is optimized to deliver consistent performance for IO-intensive workloads that require high-throughput and low latency. Premium file shares store data on the latest SSDs, making them suitable for a wide variety of workloads like databases, persistent volumes for containers, home directories, content and collaboration repositories, media and analytics, high variable and batch workloads, and enterprise applications that are performance sensitive," said Tad Brockway, corporate vice president for Azure Storage, Media and Edge, in a blog post.
Microsoft will continue to offer a standard tier of Azure Files at a lower price, with the standard tier positioned for general-purpose file storage, development, test, backups and applications that are less sensitive to latency.
In the United States, the premium tier is about four times as expensive as the standard tier per month at $0.24 per provisioned GiB rather than $0.06 per used GiB. The delta on snapshot GiB/month is slightly less, with premium going for $0.20 per used GiB, while standard is $0.06 per used GiB. (Editor's Note: The story has been updated to correct pricing. An earlier version was based on old information on Microsoft's pricing page, which was updated after the announcement.)
However, unlike with the standard tier, operations on premium files are free. That difference is reflected in the "per provisioned" (versus "per used"), which Microsoft contends makes it simpler to determine the total cost of ownership.
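The provisioned-versus-used distinction drives that total-cost comparison. A minimal sketch of the math (the share sizes below are hypothetical; the per-GiB prices are the U.S. figures quoted above, and standard-tier per-operation charges are ignored for simplicity):

```python
# Hypothetical share: 1,024 GiB provisioned on premium, 600 GiB actually used.
PREMIUM_PER_PROVISIONED_GIB = 0.24  # $/GiB/month, billed on provisioned size
STANDARD_PER_USED_GIB = 0.06        # $/GiB/month, billed on used size

provisioned_gib = 1024
used_gib = 600

premium_monthly = provisioned_gib * PREMIUM_PER_PROVISIONED_GIB
standard_monthly = used_gib * STANDARD_PER_USED_GIB  # plus per-operation fees

print(f"Premium:  ${premium_monthly:.2f}/month (operations included)")
print(f"Standard: ${standard_monthly:.2f}/month (operations billed separately)")
```

Because premium bills on provisioned capacity with operations included, its monthly cost is predictable up front; the standard tier's true total depends on both how much data is stored and how many operations are performed against it.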
The premium tier pricing goes into effect on Aug. 1. A public preview discount of 50 percent will stay in force until July 31.
Brockway also said the Azure Storage team is working internally with the Azure SQL and Microsoft Power BI teams to help leverage the premium files for higher-performance solutions. "As a result, Azure Database for PostgreSQL and Azure Database for MySQL recently opened a preview of increased scale of 16 TiB databases with 20,000 IOPS powered by premium files. Microsoft Power BI announced a powerful 20 times faster enhanced dataflows compute engine preview built upon Azure Files premium tier," he said.
Posted by Scott Bekker on 06/26/2019 at 3:02 PM
The official warnings keep coming about the "BlueKeep" security vulnerability.
This week it was the turn of the Cybersecurity and Infrastructure Security Agency (CISA), a U.S. Department of Homeland Security agency that serves as the lead national government unit on civilian cybersecurity.
BlueKeep refers to a critical vulnerability in the implementation of the Remote Desktop Protocol (RDP) used by several older Windows operating systems, including Windows 2000, Windows XP, Windows Vista, Windows 7, Windows Server 2003 and Windows Server 2008. BlueKeep's Common Vulnerabilities and Exposures (CVE) identifier is CVE-2019-0708.
Microsoft disclosed the vulnerability in mid-May and took the extraordinary step of providing patches for some of the involved operating systems that have fallen out of support -- Windows XP, Windows Vista and Windows Server 2003.
Because the vulnerability is pre-authentication and requires no user interaction, Microsoft at the time warned, "The vulnerability is 'wormable', meaning that any future malware that exploits this vulnerability could propagate from vulnerable computer to vulnerable computer in a similar way as the WannaCry malware spread across the globe in 2017."
In an end-of-May blog post, the Microsoft Security Response Center repeated its warnings about the BlueKeep vulnerability in no uncertain terms. "It's been only two weeks since the fix was released and there has been no sign of a worm yet. This does not mean that we're out of the woods ... It is possible that we won't see this vulnerability incorporated into malware. But that's not the way to bet."
Earlier this month, the U.S. National Security Agency (NSA) issued a public warning of its own urging Windows administrators to apply the patch and update their systems. In the June 4 statement, the NSA wrote, "Although Microsoft has issued a patch, potentially millions of machines are still vulnerable."
Now comes the CISA warning, which also urges users and administrators to review Microsoft's advisory and "apply the appropriate mitigation measures as soon as possible." In addition to enumerating the previous concerns about the vulnerability -- such as a successful attacker's ability to add accounts with full user rights; view, change or delete data; or install programs -- CISA goes further with a discussion of its own tests.
"CISA tested BlueKeep against a Windows 2000 machine and achieved remote code execution. Windows OS versions prior to Windows 8 that are not mentioned in this Activity Alert may also be affected; however, CISA has not tested these systems," the alert states.
Attila Tomaschek, data privacy advocate at ProPrivacy.com, said the CISA warning should not be taken lightly, in part because of the agency's test. "The fact that CISA revealed that it was able to exploit BlueKeep to execute code remotely on a computer running Windows 2000 suggests that it is only a matter of time before malicious attackers are able to do the same," Tomaschek said in an e-mailed statement.
Tomaschek suggested that the CISA's critical warning indicates that authorities believe the threat of a malicious exploit with the capability to infect large numbers of vulnerable devices is imminent. "Organizations and individuals using vulnerable Windows operating systems should take heed and install Microsoft's security updates to patch the vulnerability and insulate themselves from an attack that could potentially take over their systems and compromise hordes of sensitive data," he said.
Posted by Scott Bekker on 06/19/2019 at 3:03 PM
The pending end-of-support deadline for Windows 7 is a bright spot in an otherwise gloomy forecast for 2019, contributing to what IDC is calling an "interesting year" for PC sales.
IDC on Monday released the mid-year update of its 2019 forecast for PC sales. Overall, IDC now expects unit shipments to drop by 3 percent for the year, to a total of 392.5 million units. The main challenge comes on the consumer side of the market, where shipments are expected to decline 6 percent year-over-year as consumers spend more of their budget on replacing smartphones than on PCs.
Yet IDC projects that the average selling price (ASP) for the entire market will rise 2.6 percent for the year, keeping the dollar value of the market roughly flat at $237 billion.
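Those two figures roughly cancel: a 3 percent unit decline combined with a 2.6 percent ASP increase leaves total revenue down only about half a percent, which is why the dollar value stays near $237 billion. A quick check of the arithmetic:

```python
unit_change = -0.03  # forecast decline in unit shipments
asp_change = 0.026   # forecast increase in average selling price

# Revenue = units * ASP, so the two percentage changes multiply.
revenue_factor = (1 + unit_change) * (1 + asp_change)
print(f"Revenue change: {revenue_factor - 1:+.2%}")  # roughly -0.5%

# Implied ASP from the forecast totals: $237B across 392.5M units.
implied_asp = 237e9 / 392.5e6
print(f"Implied ASP: ${implied_asp:,.0f} per PC")    # about $604
```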
The ASP increase, according to an IDC statement, is being "driven by new technologies, such as thinner bezels on notebook screens that have increased demand for 2-in-1 form factors, and ongoing demand for gaming PCs. Additionally, shipments into the commercial segment are expected to provide an uplift in ASPs in 2019 as many enterprises move to replace their PCs before Microsoft ends support for Windows 7 in early 2020."
That key date of Jan. 14, 2020, when extended support for Windows 7 ends, and the other ASP-lifting factors are prompting IDC to declare that "2019 is shaping up to be an interesting year."
After 2019, maybe not so much. IDC currently expects unit shipments to decline by an average of 1.6 percent per year, hitting 367.7 million units in 2023.
Posted by Scott Bekker on 06/03/2019 at 3:03 PM
Veeam, a provider of backup and availability software for cloud data management, revealed a significant financial milestone this week during its annual VeeamON conference in Miami.
"We achieved $1 billion in revenue bookings," said Ratmir Timashev, co-founder and executive vice president for sales and marketing. Timashev said the figure was based on revenues for the trailing 12 months.
Veeam is unusual among private software companies in that it regularly and publicly shares financial performance data via press release. It's not the kind of comprehensive disclosure you'd see from a public company with net income, revenues and business unit results, but it's still a remarkable degree of transparency.
The revenue marker Timashev revealed this week trails slightly -- but only slightly -- behind his prediction in 2013 that the company would reach $1 billion in five years.
"We can blame [that] a little bit on subscription rights," Timashev said, referring to the shift in revenue models and the marketwide way in which businesses are buying software on a monthly basis rather than paying for licenses upfront.
Veeam also said this week it had 350,000 customers and was adding 4,000 net new customers per month and 50,000 per year.
Posted by Scott Bekker on 05/22/2019 at 3:07 PM
Orchestration technology may be a little ahead of where most customers stand on their availability journey, but Veeam is forging ahead with a second-generation product that could make the approach possible for more organizations and applications.
Failing over a complex environment in a disaster recovery situation is a multistep process. Processes and applications must be started in a precise order and spun up on the correct hardware or virtual machines. Orchestration solutions allow organizations to define in advance the order in which those automated steps are taken when a failover is needed.
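The ordering problem is essentially a dependency graph: a database must be up before the application tier, which must be up before the load balancer. A minimal sketch of how an orchestrator might derive a safe startup order (the service names and dependencies below are invented for illustration, not Veeam's implementation):

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical failover plan: each service lists what must already be
# running before it can start at the recovery site.
dependencies = {
    "database": [],
    "app-server": ["database"],
    "web-frontend": ["app-server"],
    "load-balancer": ["web-frontend"],
}

# static_order() yields a startup sequence that respects every dependency.
startup_order = list(TopologicalSorter(dependencies).static_order())
print(startup_order)
# ['database', 'app-server', 'web-frontend', 'load-balancer']
```

Shutdown for failback would simply run the same order in reverse.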
Veeam Availability Orchestrator v2 hit general availability on Tuesday during the VeeamON 2019 conference in Miami.
Danny Allan, vice president of product strategy for Veeam, said the flagship feature of the new version is that it now allows orchestrated business continuity from backups rather than strictly from replication environments.
"Doing it from backups means you don't have to be running 24x7 in both locations. This now democratizes orchestrated business continuity disaster recovery to the entire customer base, and not only the customer base but the whole industry," Allan said.
Allan described Veeam's vision of the cloud data management journey for customers as about a 10-year process. The first stage is backups protecting all workloads, followed by cloud mobility. Most organizations are in those two stages, Allan said. Because of the General Data Protection Regulation (GDPR), companies in Europe are slightly ahead of U.S. companies in the third stage, visibility. Relatively few organizations have reached the fourth stage, orchestration, or the final stage, automation, he said.
One Veeam customer very interested in the orchestration tool is ABM Industries, a large facilities management company. Tom Morley, ABM's director of global technology operations and enterprise engineering, is an intensive user of Veeam technologies but sees orchestration as a 2020 project, after a current modernization overhaul is complete.
"As part of modernizing, our weakest spot is probably orchestration across all of our systems," Morley said. "Next year will be about orchestrating all the way down."
Veeam's v2 includes several other new features. Reporting and compliance capabilities have been enhanced so organizations can use the orchestrator to prove that service-level agreements are being met. The orchestrator can also be used for purposes aside from recovery, such as DevOps, testing and analytics. Veeam has also added role-based access control to allow for more fine-grained delegation.
Posted by Scott Bekker on 05/22/2019 at 3:06 PM
A federal computer security watchdog agency on Monday warned Office 365 users and their technology partners about common Office 365 misconfigurations.
In an analysis report titled "Microsoft Office 365 Security Observations," the Cybersecurity and Infrastructure Security Agency (CISA) described four common security misconfigurations found during a multi-month investigation begun last fall. CISA is the new standalone agency within the Department of Homeland Security that functions as the lead national government unit on civilian cybersecurity.
The investigation focused on customers who have used third-party partners to migrate their e-mail services to Office 365. The CISA report did not say how many customer environments it looked at, how widespread the problems were at those sites or what kinds of third-party partners were involved.
The conclusion, however, was stark. "The organizations that used a third party have had a mix of configurations that lowered their overall security posture (e.g., mailbox auditing disabled, unified audit log disabled, multi-factor authentication disabled on admin accounts)," the report said. "In addition, the majority of these organizations did not have a dedicated IT security team to focus on their security in the cloud. These security oversights have led to user and mailbox compromises and vulnerabilities."
The report details the configuration problems, which include administrator sign-ins exposed to attack without multifactor authentication (MFA) protections in place, audit logs left disabled, and a setup that allows attackers who have compromised on-premises accounts to move laterally into the cloud.
The main MFA problem involved organizations that didn't set up MFA for the Azure Active Directory (AD) Global Administrators in an Office 365 environment. Microsoft does not require MFA by default in creating the accounts, and many organizations don't change the setting. The report notes that the Azure AD Global Administrator accounts are the first ones created and are required to configure the tenant and migrate users. "These accounts are exposed to internet access because they are based in the cloud. If not immediately secured, these cloud-based accounts could allow an attacker to maintain persistence as a customer migrates to O365," the report warned.
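The actual check is done with Microsoft's admin tooling, but the logic reduces to a simple scan: given an exported list of accounts, flag any Global Administrator without MFA enforced. A sketch (the record layout and account names here are our assumption for illustration, not CISA's or Microsoft's format):

```python
# Flag privileged accounts without MFA, mirroring the misconfiguration
# CISA describes. The field names are hypothetical -- adapt them to
# whatever your directory export actually contains.
accounts = [
    {"upn": "admin@contoso.example", "role": "GlobalAdministrator", "mfa_enforced": False},
    {"upn": "breakglass@contoso.example", "role": "GlobalAdministrator", "mfa_enforced": True},
    {"upn": "user1@contoso.example", "role": "User", "mfa_enforced": False},
]

at_risk = [
    a["upn"]
    for a in accounts
    if a["role"] == "GlobalAdministrator" and not a["mfa_enforced"]
]
print(at_risk)  # ['admin@contoso.example']
```

Running a check like this immediately after tenant creation, before migration begins, addresses the window of exposure the report describes.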
A related problem flagged by CISA involves legacy protocols that don't support MFA, including POP3, IMAP and SMTP. While the report acknowledges that users with older e-mail clients may need these less secure protocols, it recommends limiting their use to specific users and weaning the organization off the protocols as quickly as possible.
Of all the problems highlighted in the report, CISA stressed enabling MFA as a best practice: "This is the best mitigation technique to use to protect against credential theft for O365 users."
Auditing is another commonly discussed problem in Office 365 security circles. Mailbox auditing was disabled by default prior to January 2019, meaning organizations trying to investigate potential breaches often discovered they had no logs to look at if they hadn't enabled the feature. CISA urged organizations whose Office 365 configuration was set up prior to January of this year to ensure that mailbox auditing is enabled.
The analysis report also pointed to another logging feature which is still disabled by default -- the unified audit log. That log records events from several Office 365 services, including Exchange Online, SharePoint Online, OneDrive, Azure AD, Microsoft Teams and Power BI. Administrators can enable the unified audit log in the Security and Compliance Center.
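Administrators verify these settings with Exchange Online and compliance-center tooling, but conceptually the audit-readiness check is just: is auditing on everywhere? A sketch over a hypothetical export of per-mailbox settings (the CSV columns and addresses are ours, not a real Microsoft export format):

```python
import csv
import io

# Hypothetical CSV export of per-mailbox audit settings; in practice
# this data would come from your Exchange Online admin tooling.
export = io.StringIO(
    "mailbox,audit_enabled\n"
    "alice@contoso.example,True\n"
    "bob@contoso.example,False\n"
)

# Any mailbox with auditing off is a blind spot in a breach investigation.
unaudited = [
    row["mailbox"]
    for row in csv.DictReader(export)
    if row["audit_enabled"] != "True"
]
print(unaudited)  # ['bob@contoso.example']
```

The same sweep logic applies to the unified audit log: confirm it is enabled before an incident, not after.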
Another configuration choice that can lead to security problems involves password sync using Azure AD Connect, the report states. A useful migration tool designed to create Azure AD identities from on-premises identities or to match previously created Azure AD identities with on-premises AD identities, Azure AD Connect can cause security problems in certain cases.
Posted by Scott Bekker on 05/13/2019 at 3:07 PM
VMware customers will be able to extend their VMware infrastructure investments to the Microsoft Azure cloud as part of an expansive partnership announced Monday that follows a similar VMware-Amazon Web Services deal from a few years ago.
CEOs of Microsoft, VMware and VMware majority owner Dell Technologies Inc. announced the deal at Dell Technologies World in Las Vegas.
The arrangement follows a controversial recent effort by Microsoft to conduct its own implementation of a VMware technology integration for Azure in a way that was not supported by VMware.
Other parts of the deal include support for managing Office 365 across devices via VMware Workspace ONE, integration by VMware of support for Microsoft's forthcoming Windows Virtual Desktop (WVD), and future work on networking and on delivery of Azure services for VMware on-premises customers.
In a statement, Microsoft CEO Satya Nadella positioned the deal as part of Microsoft's recent pattern of working closely with sometimes bitter, or at least partial, competitors to advance common customer interests. "At Microsoft, we're focused on empowering customers in their digital transformation journey, through partnerships that enable them to take advantage of the Microsoft Cloud, using the technologies they already have," Nadella said.
Scott Guthrie, executive vice president for Microsoft's Cloud and Enterprise Group, expanded on the theme in a blog post, putting the VMware deal in a line of agreements that includes SAP, Red Hat, Adobe and Citrix.
Called Azure VMware Solutions, the main element of the deal is technology built on VMware Cloud Foundation to run VMware workloads natively on Azure. "Customers can now seamlessly run, manage and secure applications across VMware environments and Microsoft Azure with a common operating framework," Guthrie wrote in his blog post. "Customers will be able to capitalize on their existing VMware investments, skills and tools, including VMware vSphere, vSAN, NSX and vCenter while leveraging the scale, performance and innovation of Azure."
In addition to giving customers the ability to manage on-premises and Azure clouds from within their current set of VMware tools, the two companies position the integration as a strong solution for application migration and modernization, datacenter resizings and disaster recovery/business continuity.
Azure VMware Solutions is available immediately in two Azure regions -- U.S. East and U.S. West -- with availability in the West Europe region coming shortly, according to a Microsoft FAQ. While it is sold by Microsoft, backed by the Azure service-level agreement and supported by Microsoft and VMware, it was developed in collaboration with VMware-certified partner CloudSimple. Additionally, a second version is being developed for release later this year by Virtustream, a Dell subsidiary.
The other immediate piece of the partnership will allow VMware Workspace ONE customers to manage Office 365 on devices using VMware's toolset. On stage Monday, VMware CEO Pat Gelsinger described the arrangement as ending a dilemma for customers. "We've solved this battle that we've been having -- is it going to be a Workspace ONE device or a Microsoft Intune device? Gone," Gelsinger said. He said Workspace ONE would have best-in-class support for Office 365, Microsoft 365, Windows 10 and Azure Active Directory.
Also getting "first-class citizen" status within VMware infrastructure will be WVD, Gelsinger said. WVD is currently a Microsoft public preview for a service that delivers a multisession Windows 10 experience, optimizations for Office 365 ProPlus and support for Windows Server Remote Desktop Services (RDS) desktops and apps. VMware will extend the capabilities of WVD through VMware Horizon Cloud on Microsoft Azure. A tech preview is expected by the end of this calendar year.
Longer-term, the companies are exploring integrations between VMware NSX and Azure Networking, as well as bringing specific Azure services to VMware on-premises customers. No specific timeframe was immediately available for those efforts.
Posted by Scott Bekker on 04/29/2019 at 3:09 PM
Ongoing strength in its cloud business and a recovery on the Windows side helped power a strong third quarter for Microsoft, according to the company's latest financial results.
In results released after markets closed Wednesday, Microsoft reported revenue of $30.6 billion, an increase of 14% over the year-ago quarter and well ahead of analysts' expectations. Other headline figures included a 25% gain in operating income to $10.3 billion, a 19% gain in net income to $8.8 billion and a 20% increase in diluted earnings per share to $1.14.
CEO Satya Nadella pointed to the customer demand for Microsoft's constantly evolving cloud services as a key factor for the reporting period, Microsoft's third quarter, which ended March 31. The company pegged commercial cloud revenues at $9.6 billion for the quarter. That's a 41% jump year-over-year on an already large figure.
Inside those cloud revenues, Microsoft's strategic Azure cloud computing platform was a key growth driver. Microsoft reported 73% revenue growth for Azure. Office 365 Commercial revenue also continued to plow ahead, with 30% revenue growth. On the consumer side, Office 365 Consumer subscribers increased to 34.2 million.
One other business growth area for cloud was Dynamics 365, Microsoft's cloud platform for its ERP, CRM and other business applications. Dynamics 365 revenues increased 43% compared to the year-ago quarter.
Last quarter, the Intel chip shortage was a problem for Microsoft, with Chief Financial Officer Amy Hood at the time attributing a smaller overall PC market to the timing of chip supply to Microsoft's OEM partners. While in Q2 Windows OEM Pro revenue dropped by 2% and non-Pro revenue fell 11%, no such problems existed in Q3. Microsoft reported Wednesday that Q3 Windows OEM revenues were up 9% year-over-year. Revenues for Microsoft's own Surface products, meanwhile, were up 21% in the quarter.
In other highlights:
- LinkedIn, which Microsoft purchased in 2016 for $26 billion, continued to perform well in the third quarter, with revenue increasing 27%.
- Enterprise Services revenues increased 4%.
- Gaming revenue was up 4%.
- Search revenue increased 12%, excluding traffic acquisition costs.
Posted by Scott Bekker on 04/24/2019 at 3:09 PM
Have you had any problems with Azure consumption overages?
It turns out that Microsoft is counting on customers to end up paying more for Azure than they may have planned to.
During Microsoft's last earnings call in January, Chief Financial Officer Amy Hood highlighted Azure consumption overages as a source of growth for the company.
"As a reminder, strong performance in larger, long-term Azure contracts, Azure consumption overages, and pay-as-you-go contracts will drive bookings growth and in-period revenue but will have a limited impact on unearned revenue," Hood said during the call.
There wasn't a direct dollar figure attached, and Hood's comment downplays the total a bit. But when you're a $110 billion revenue company, any amount of money that's worth bringing up in a half-hour call with investors qualifies as a significant sum.
We'd like to hear your stories about Azure consumption overages. How did the overage happen in your case? How much did it cost you? How did you address the problem and have you been able to contain it since? Drop us a note at firstname.lastname@example.org.
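As a back-of-the-envelope illustration of how an overage accrues (the commitment model, rates and figures below are invented for illustration, not Microsoft's actual billing terms): if consumption exceeds the amount covered by a monetary commitment, the excess is billed separately, often at less favorable pay-as-you-go rates.

```python
# Toy overage estimate. All figures are hypothetical.
committed_usd = 10_000.00                 # monthly committed Azure spend
consumed_usd_at_commit_rate = 12_500.00   # what was actually consumed
payg_markup = 1.10                        # assume overage bills ~10% higher

# Anything beyond the commitment is the overage, billed at the marked-up rate.
overage = max(0.0, consumed_usd_at_commit_rate - committed_usd)
overage_bill = overage * payg_markup
print(round(overage_bill, 2))  # 2750.0
```

In this toy case, a 25 percent consumption overshoot turns into a $2,750 surprise on the invoice, which is exactly the kind of story we're asking about.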
Posted by Scott Bekker on 04/10/2019 at 3:09 PM
A new survey finds a dangerous gap between organizations' perceptions and actions when it comes to Office 365 compliance and security.
In short, those who believe Microsoft is doing a good job with security and compliance may not be taking the baseline steps required to keep their environments safe and in compliance -- the basic configuration work that Microsoft's tools rely on to provide protection.
And those who don't believe Microsoft protections are enough tended not to be aware of all the steps Microsoft takes on their behalf.
The results come from "Organizational Security & Compliance Practices in Office 365," a 37-page report conducted by CollabTalk LLC and the Marriott School of Business at Brigham Young University and commissioned by Spanning Cloud Apps, RecordPoint, tyGraph, Rencore and Microsoft. (Redmondmag.com is an in-kind sponsor of the research.) The report, released this week, is based on surveys of more than 270 IT professionals, executives and managers across 19 industries, and includes commentary from several Microsoft Most Valuable Professionals (MVPs) and experts.
Specifically, the report said that:
- Of those who thought Microsoft security was sufficient, 80% of respondents had either not run security and compliance checks or did not know whether they had.
- Of those who did not think the current security protection offered by Microsoft was sufficient, 57% of respondents were not aware of Microsoft's security division.
- Of those who did not think the current security protection offered by Microsoft was sufficient, 71% of respondents were not aware of Microsoft's overall security and compliance strategy.
One of the MVP commenters, Matthew McDermott, lays responsibility for this gap squarely on the organizations, which are themselves struggling to keep abreast of the many administrative tools, settings and options within Office 365 components and dealing with hybrid environments that involve many more platforms than just Office 365.
"The gap presented in this research is not from a lack of features, vision or direction from Microsoft; the gap comes from within organizations," said McDermott, Spanning's principal technical marketing engineer and the Conference Chair for Office & SharePoint Live!, an event run by Redmondmag.com's parent company, in a statement about the report. "Companies must invest in personnel and tools to ensure compliance and secure systems. It's not enough, with today's threat landscape, to be reactive. You need to be proactive in your approach to keeping your assets and customer data safe and secure."
Another of the MVPs, Erica Toelle, product evangelist at RecordPoint, portrayed the gap as a painful step in a journey toward a better overall situation on security and compliance. "Before the cloud, people managed security and compliance all on their own. Outsourcing this to Microsoft is a good idea. Microsoft has more budget to hire the industry-leaders, so they are more secure. People don't perceive this because their understanding is immature. They don't know how much Microsoft is protecting them or not. They also don't really have complete control over the situation," Toelle stated in the report's conclusion.
Recommendations in the report include approaching security and compliance more holistically, identifying feature gaps and creating an operational strategy for addressing them, conducting inventory audits, creating training plans, developing governance and change management programs and committees, and setting up pilot programs to understand the latest features and capabilities of Office 365.
The report is available from the Spanning Web site here.
Posted by Scott Bekker on 03/29/2019 at 3:10 PM
One of the biggest deadlines in the history of IT will be only 300 days away later this week.
We're talking about the deadline for extended support for Windows 7. Once Jan. 14, 2020 arrives, Microsoft will stop sending out free security updates for the operating system, and other types of support will also be shut down.
It's a key deadline because Windows 7 was in many ways Microsoft's most popular operating system to date -- beating out Windows XP's market share after a struggle, surpassing Windows Vista's without any contest and topping that of its immediate successor, Windows 8/8.1, pretty handily.
Windows 10 has been taking over from Windows 7, more slowly than Microsoft publicly stated it wanted, and with fits and starts, but it's getting there.
If you're one of the organizations that's still working your way free of Windows 7 -- and take comfort in the fact that you're far from alone -- the editors of Redmondmag.com have put together a series of resources to help.
Check out our "Ultimate Guide to Windows 7 Migrations." This free guide brings a lot of what you need to know into a single 22-page PDF.
The guide includes:
- A summary of key dates, not just for Windows 7 but for other critical products that are nearing their support deadlines.
- Discussions of options for methods to migrate to Windows 10.
- An emergency section describing what to do if some or all of your desktops won't make the upgrade deadline.
- Details on the way the new Windows Virtual Desktop on Azure fits into the mix.
- Exclusive survey research about migration progress and deployment plans from our research arm, Redmond Intelligence.
- Key information about what to look for in client systems if you're pairing your Windows 10 migration with new hardware.
Click here to download this valuable resource and get your Windows 7 migration project on track (registration required).
Posted by Scott Bekker on 03/18/2019 at 3:10 PM
Just days before the 2019 RSA Conference, Microsoft on Thursday announced the preview releases of two new cloud-based security services: Azure Sentinel and Threat Experts.
Azure Sentinel is a native security information and event management (SIEM) tool that runs in Microsoft's public cloud. Ann Johnson, corporate vice president for Cybersecurity Solutions at Microsoft, touted Azure Sentinel as "the first cloud-native SIEM within a major cloud platform" during a media briefing on Wednesday.
Johnson said Sentinel was built from scratch with the help of industry partners as a modern security tool to collect, parse and present security data from users, devices, applications and infrastructure, both on-premises and in the cloud. Like many of Microsoft's current initiatives, key selling points are the flexible and scalable nature of having the solution running in the cloud and the ability to leverage Microsoft's artificial intelligence (AI) infrastructure and expertise.
At the same time, Microsoft also championed the tool's potential to cut both administrative burdens of on-premises SIEM approaches and the time wasted on inconsequential SIEM alerts.
"I don't need to have people maintaining infrastructure, patching, dealing with upgrades, things like that. I've just got my people focused on finding threats," said Eric Doerr, general manager of the Microsoft Security Response Center (MSRC), in a video about the MSRC's dogfooding of Azure Sentinel.
Johnson put the alerts in the context of the IT security skills gap. "The cybersecurity landscape is at a point where the attackers do have an advantage due to a lack of skilled cyberdefenders. With an estimated shortfall of over 3 million security professionals by 2021, there simply are not enough defenders to keep pace with the growing profit opportunity that cybercrime offers," she said. "Existing defenders are overwhelmed by threats and alerts. They often spend their days chasing down false alarms instead of doing what they do best, investigating and solving complex cases."
Microsoft contends that its machine learning (ML) algorithms and knowledge from handling trillions of signals each day inform the Sentinel tool.
Pricing has not been set for Azure Sentinel. The preview is free, and licensed Office 365 customers will be able to import their Office 365 data into the tool at no charge as an ongoing feature once the service is generally available.
The other preview, Threat Experts, is a high-end, "managed threat hunting service" within Windows Defender Advanced Threat Protection (ATP) that's aimed at security operations centers. The intent is again to use Microsoft's expertise, AI/ML resources and massive global signals collection to provide context around security alerts that could help organizations find, prioritize and respond to security problems. The service consists of attack notifications that are supposed to be tailored to an organization's needs and the availability of Microsoft experts who can be engaged on demand.
"Not every organization has access to the level of human expertise they need. Microsoft is now offering our security experts as an extension of our customers teams," Johnson said. "Experts provide the insights our customers need to get additional clarification on alerts, including root cause or scope of an incident, suspicious machine behavior and next steps if faced with an advanced attacker. They can also help determine risk and protection regarding threat actors campaigns or emerging attacker techniques."
Although the new Threat Experts service is also in preview, customers already need to have Windows Defender ATP to access it. The Windows Defender ATP platform is a toolbox of prevention, detection, investigation and response tools for enterprises. Threat Experts joins elements like attack surface reduction, endpoint detection and response, automated investigation and remediation, Secure Score and advanced hunting tools. Windows Defender ATP is available only in Microsoft's most expensive licensing packages, such as Windows 10 Enterprise E5 and Microsoft 365 E5.
Posted by Scott Bekker on 02/28/2019 at 3:12 PM
A partnership is in the works between Microsoft and VMware to make it much easier for enterprise customers to move VMware workloads into the Azure public cloud, according to a published report.
Citing unnamed sources, subscription technology site The Information on Tuesday reported that the historical rivals are jointly working on the integration and could be within weeks of an announcement. The article attributed the information to "a person with direct knowledge of the project and six others who have been briefed on it."
The arrangement reportedly involves VMware's server virtualization software. Neither Microsoft nor VMware is commenting. If true, the deal represents another example of Microsoft setting aside long-running competitive fights -- in this case, between VMware's hypervisor technology and Microsoft's own Hyper-V offerings -- in favor of attracting increased workloads to the Azure public cloud and positioning Microsoft better for the more strategic fight with its larger public cloud rival, Amazon Web Services (AWS).
VMware already has a similar arrangement with AWS.
"While it is already possible to move computing jobs running on VMware inside private data centers to Azure, it requires extensive technical work. The new software Microsoft and VMware are developing aims to significantly speed up this process, making it cheaper for them to accomplish," the article by Kevin McLaughlin stated.
The deal follows a controversial move by Microsoft in November 2017 to build software on its own to allow VMware computing jobs to run on Azure, a move that VMware challenged at the time.
One interesting personnel detail in the article that supports the idea that the companies are working together is the presence of Ray Blanchard, identified as the former VMware executive in charge of the partnership with AWS. Blanchard joined Microsoft a year ago.
Posted by Scott Bekker on 02/27/2019 at 3:12 PM
From the beginning, Microsoft's vision for the Azure Stack involved situations where you're getting your hands dirty.
The Azure Stack is supposed to bring much of the power of Azure cloud computing out to the edge, where users can run full artificial intelligence (AI) or other processing-intensive workloads without waiting to connect to the cloud.
Use cases cited by Microsoft CEO Satya Nadella in 2018 included early adopter Chevron deploying Azure Stack on oil rigs. A demo video last year featured Scott Montgomery, a senior industry solutions manager at Microsoft, driving around in a one-ton Chevrolet Suburban decorated with the Microsoft logo and loaded with an Azure Stack in the cargo area. The point was to highlight disaster relief work, remote power line inspections with a drone and other field scenarios.
Yet the first implementations of Azure Stack, which is sold as a complete hardware and software solution by a handful of OEM partners, were primarily designed for the standard datacenter, which is an exceptionally clean room in most cases. That was a good place to start as many of the less-photogenic implementations of Azure Stack call for data processing at a branch office or a remote facility that doesn't require the server kit to be mobile once it's installed.
This week brought the next logical step in the evolution of the systems with the unveiling of the Dell EMC Tactical Microsoft Azure Stack. You could think of it as the Toughbook of Azure Stacks -- or as Dell would probably prefer, the Dell Latitude Rugged of Azure Stacks.
"Tactical Azure Stack is the first and only ruggedized Azure Stack product available for tactical edge deployments," wrote Paul Galjan, senior director of Microsoft Hybrid Cloud at Dell EMC, in a blog post announcing the system, which is expected to be available this quarter in the United States.
Unlike a laptop, the Tactical Azure Stack is a two-person lift at 380 pounds. That weight still qualifies the system as fully mobile or highly portable, and it's reasonable considering the 41.5"-high, 25.6"-deep box includes all the servers, storage and networking gear needed to run the Azure software. There's an option to use additional "core" transit cases to scale up to the full node limits of Azure Stack.
"The Tactical Microsoft Azure Stack unlocks a wide variety of use cases for government, military, energy and mining applications," Galjan said. "It can also be ideal in forward deployments and mobile environments in marine, aerospace and other conditions that require MIL-STD 810G compliance."
Also this week, Microsoft, which integrated the Azure Stack with the Azure Government cloud last year, unveiled new Azure Data Box products for Azure Government. The on-premises appliances include the Azure Data Box Edge, available now in preview; the Azure Data Box Gateway and Azure Data Box, both available in March; and the Azure Data Box Heavy, set for availability in the middle of the year.
Posted by Scott Bekker on 02/06/2019 at 3:12 PM
Microsoft reported second quarter revenues on Wednesday that slightly missed financial analysts' targets, but that reflected double-digit growth in strategic business units.
Revenue for the quarter ended Dec. 31 was $32.47 billion, a gain of 12% over the year-ago period, and below analyst expectations of $32.51 billion. Diluted earnings per share were slightly higher than what Wall Street expected, coming in at $1.10 non-GAAP against predictions of $1.09.
By the company's three overarching business segments, growth was strongest in Intelligent Cloud, followed by Productivity and Business Processes, with More Personal Computing bringing up the rear. Intelligent Cloud revenues rose 20% in the quarter to $9.4 billion. Productivity and Business Processes was up 13% to $10.1 billion, and More Personal Computing delivered single-digit growth of 7% to $13 billion.
In a statement, Chief Financial Officer Amy Hood drew attention to revenue growth in Microsoft's commercial cloud category, which crosses boundaries between the business segments. "Our solid execution delivered another strong quarter, with commercial cloud revenue growing 48% year-over-year to $9.0 billion," Hood said.
In a quarter when some infrastructure players like Intel struggled, Microsoft's best data point came from its Azure line of cloud infrastructure products. The Azure business, which competes against market leader Amazon Web Services (AWS) and other players including Google Cloud Platform (GCP), grew 76% in the quarter. The growth percentage is sequentially flat for Microsoft, but the high-double-digit result still demonstrates strong momentum.
The numbers also suggest that the massive acquisition of LinkedIn is going well. Revenues for LinkedIn are up 29%, and sessions growth for the work-based social media platform is up by 30%.
A new batch of Surface hardware devices unveiled last October provided a 39% bounce in Surface revenues for the quarter. Those devices, all shipping for the holidays, included the Surface Pro 6, Surface Laptop 2, Surface Studio 2 and Surface Headphones.
Other highlights included 24% growth in server products, a category that includes and was mostly driven by the Azure performance; and 13% growth in Windows commercial products and cloud services.
The biggest headwind by far in the quarterly results related to desktop Windows, once the crown jewel of the company and now something of a drag as the company transitions to cloud. Windows OEM revenues dropped 5% in Microsoft's second quarter compared to the same October-to-December period in 2017. Also weak was Office Consumer products and cloud services, which grew but only by 1%.
Shares of Microsoft fell by 3% in after-hours trading after the company released its results Wednesday.
Mark Sami, vice president of Microsoft and Cloud Solutions at SPR, a Chicago-based Microsoft managed partner specializing in digital transformation projects, contends the stock may have already dropped too far.
"An initial look at the numbers indicates the drop in earnings was a miss on the personal computing side, though growth still looks to be strong in the productivity cloud and infrastructure cloud offerings," Sami said in an e-mail. "This seems to be an overreaction, as the miss on the desktop side is minimal and many Microsoft users will have to upgrade their Windows 7 environments that will be out of support after this year. This is going to drive a lot of revenue to this sector of the business, as well as potentially boost the productivity cloud numbers because of the new way licenses are packaged."
Sharing the view that Microsoft's quarterly results look especially strong on the cloud side is Ryan Duguid, chief evangelist at Nintex, a process management and automation specialist and close Microsoft partner.
"Microsoft has certainly enjoyed an impressive run over the last 5+ years, but to be honest, I think they're only just getting warmed up," Duguid said by e-mail. "Having successfully transitioned away from a dependency on Windows and a perpetual license model across all core franchises, Microsoft has now positioned itself as the dominant player in the cloud, both in terms of core compute power with Azure, as well as application delivery through Office 365."
Posted by Scott Bekker on 01/30/2019 at 3:13 PM
Windows 10 now has a larger market share than any other desktop operating system version, including the previous king of all desktop OS versions, Windows 7.
Net Applications noted that major IT industry milestone in its December 2018 market share figures. With 39.22 percent market share, Windows 10 has a narrow but solid lead over Windows 7 at 36.9 percent. That puts Windows 10 on top in both of the most frequently cited platform trackers (Windows 10 took the lead with Statcounter in January 2018).
Windows 10's now-undisputed lead in both major trackers tells us several things about the state of IT infrastructure.
First, it tells us that Microsoft still has enough weight in the industry to dictate an OS shift. That seems like a fairly obvious point, but Microsoft did ask a lot with Windows 10, especially with the new and confusing update model. Things Microsoft had going for it included the inertia of an industry accustomed to moving to the next Windows OS every few years and that limited-time free upgrade offer.
Even with those advantages, success wasn't a foregone conclusion. Uptake of Windows 10 has been slower and at a smaller scale than Microsoft had publicly hoped for. Three-and-a-half years after launching, Windows 10 is on about 700 million machines. That falls short of the 1 billion systems Microsoft had predicted Windows 10 would power in slightly less time, but it's still impressive. What's also impressive is that Microsoft managed that progress at the same time that it has been distancing itself from its long-held identity as a Windows company.
Of course, there's also the stick. Windows 7 hits the end of extended support one year from now. Look for Windows 10's share to ratchet up steadily as companies and consumers race to meet the support deadline, or at least convert as soon after it passes as they can.
Getting Windows 10 to the top spot underscores something else: Microsoft continues to dominate an important piece of technology real estate. The PC is certainly not the prize that it was 10 years ago. Credit for that goes to the smartphone, the mobile app ecosystem, constantly improving wireless data coverage and speeds, and cloud-based applications, among other things. Yet the PC is still the platform where most day-to-day productive work gets done. The smartphone eats into it, the tablet takes a piece, but for the most part the form factors are finding their niches and the PC fills a critical one.
Tasks at which the PC remains the ideal platform include working with words, numbers or code on a big screen with a full-size keyboard, multitasking, copying and pasting across applications and storing files for offline access. Desktops and laptops are still a massive market (remember that 700 million figure above?). Predictions that Linux would take off on the PC remain largely unfulfilled. Mac continues to gain a few percentage points here and there, but there's been nothing like a large drop off in Windows usage.
What may be most important about this latest desktop share milestone, though, is that it could be the last shift of this type. Windows OS migrations have been a staple project in the IT industry for decades -- Windows 95 to Windows 98, Windows 98 to Windows 2000, Windows 98 to Windows XP, and on and on and on. The project has come up like clockwork every three or four years. Windows 10 was famously called "the last version of Windows" by Microsoft developer evangelist Jerry Nixon. A better way to think of it may be as the "forever version of Windows."
The idea with Windows 10 is that it's constantly updated: individual versions go out of support every 18 months, but keeping current with the updates pushes those support dates back indefinitely. Migrations for the most part will be due to hardware refresh cycles, not Microsoft support deadlines. Admittedly, it's more complicated than that with the Long Term Servicing Channel and Software Assurance timelines and other licensing and support wrinkles. There will be kinks that arise with the updates and the rings where application compatibility will be an issue, but they'll largely be one-off situations.
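The servicing arithmetic is easy to sketch. Here's a minimal illustration, assuming a flat 18-month window; actual Windows 10 end-of-service dates vary by edition and servicing channel, and the release date used below is just an example:

```python
from datetime import date

def end_of_support(release: date, months: int = 18) -> date:
    """Approximate end-of-support date: release date plus a servicing window."""
    # Add months by rolling the year/month forward, clamping the day
    # to 28 so the result is always a valid calendar date.
    total = release.month - 1 + months
    return date(release.year + total // 12, total % 12 + 1, min(release.day, 28))

# Version 1809 (the October 2018 Update) as an example release:
print(end_of_support(date(2018, 10, 2)))  # → 2020-04-02
```

The practical upshot is visible in the math: as long as a machine keeps taking feature updates, the deadline keeps sliding forward and there is no big-bang migration date.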
A relic, thankfully, is the industrywide, all-hands-on-deck situation of the old Windows update cycle, with ISVs and OEMs all creating new versions of their PCs, applications and drivers, and partners and IT departments testing them all out at once and trying to get them fixed in the first service pack. Another upside could be a more secure Internet, where aging security flaws can't continuously be exploited because connected consumer machines are automatically updated for free, reducing everyone's risk.
For the 39.22 percent or so of users at home and in organizations who have migrated to the forever OS, congratulations. All of that migration drama is behind you. If you're in the process of a migration project or planning one, take heart -- this should be the last of its kind.
A lot of challenging IT projects remain. The forced update from one soon-to-be-unsupported OS to the next one with its own ticking support clock won't be one of them. Instead, partners and IT departments can focus on higher-value efforts like server migrations to the cloud, digital transformation projects and creating great business applications.
The end of the great Windows migration is in sight. In the wake of that mainstay IT project is a more stable, more secure PC with a smaller, but still important, role.
Posted by Scott Bekker on 01/07/2019 at 3:14 PM
In case you hadn't noticed, server sales are booming.
Market researchers at IDC reported Tuesday night that the third quarter represented the highest total revenue in a single quarter for servers ever.
For those of you looking around at server rooms much emptier than you remember from a decade ago, before the financial crisis and other factors pushed the computer hardware market sideways, it clearly doesn't feel like a boom. But as they say, the cloud is just someone else's datacenter, and those someone elses are loading up on hardware.
By the numbers, the server market soared year over year in the third quarter by 38 percent in revenues to $23.4 billion and by 18 percent in shipments to 3.2 million units.
It's the fifth consecutive quarter of double-digit revenue growth, according to IDC.
"The worldwide server market once again generated strong revenue and unit shipment growth due to an ongoing enterprise refresh cycle and continued demand from cloud service providers," said Sebastian Lagana, research manager for Infrastructure Platforms and Technologies at IDC, in a statement. "Enterprise infrastructure requirements from resource intensive next-generation applications support increasingly rich configurations, ensuring average selling prices (ASPs) remain elevated against the year-ago quarter. At the same time, hyperscalers continue to upgrade and expand their datacenter capabilities."
The increases reach across the board -- with volume server revenues up 40 percent to $20 billion, midrange revenue up 39 percent to $2 billion, and high-end systems up 7 percent to $1.3 billion. Dell led the quarter both in revenue and unit shipments, followed in revenues by HPE/New H3C Group, Inspur, Lenovo, IBM and Huawei in a tie, and Cisco. Dell, Inspur, Lenovo and Huawei are up; HPE, IBM and Cisco are down.
But as interesting as the slight jockeying for position among those enterprise vendors may be, the steadiest growth is coming from the largely anonymous manufacturers making the servers that power the hyperscale datacenters behind the Amazon Web Services, Microsoft, Google, Facebook and other clouds.
IDC labels those vendors ODM Direct: original design manufacturers that sell directly to high-scale end customers and build to a customer's specific datacenter needs. Think, for example, about how particular Microsoft is about the system requirements in a modular Azure datacenter. It's not interested in off-the-shelf servers.
That group of ODM Direct vendors accounted for $6.3 billion in collective revenue, a gain of 52 percent year over year, and collectively above Dell's individual $4 billion in revenues.
Another rough way to think about this booming server market: ODM Direct's $6.3 billion is about 27 percent of the quarter's $23.4 billion in total revenue, meaning roughly one in four new server dollars is bound for the hyperscale cloud.
Posted by Scott Bekker on 12/12/2018 at 10:25 AM
Microsoft Teams is riding the Office 365 rocket to market share among collaborative chat applications, a new survey suggests.
A Spiceworks survey of its community of IT professionals saw Teams' usage share surge sevenfold over two years, with the 900 respondents in North America and EMEA projecting another doubling of usage over the next two years.
"The sudden rise of Microsoft Teams is likely influenced by the fact that it's available at no additional cost to Office 365 users," said Peter Tsai, senior technology analyst at Spiceworks, in a statement accompanying the survey results Monday.
According to the Spiceworks survey, Microsoft's own Skype for Business has the biggest share at 44 percent. However, Slack-versus-Teams is the high-profile battleground, and in just two years since its launch, Teams has vaulted ahead according to the survey. Teams is now at 21 percent share, up from 3 percent in 2016. Slack is at 15 percent, up from 13 percent two years ago.
Looking ahead to the end of 2020, 53 percent of users expect to be using Skype for Business, 41 percent expect to be using Teams, 18 percent expect to be using Slack and 12 percent expect to be using Google Hangouts.
"Although Skype for Business has maintained the lead overall, Microsoft is putting more of an emphasis on Microsoft Teams as the default communications app for Office 365, which is enticing organizations to give it a try. As a result, we'll likely see Teams adoption rates double in the next couple years," Tsai said.
The survey found that overall usage of chat apps is increasing among businesses, with usage up 20 percentage points to 62 percent this year compared to 2016. At the same time, the expectation that chat apps will supplant e-mail is down among IT pros to 16 percent from 25 percent two years ago.
As they shift to Teams, organizations seem to be trading innovation and usability for security. Survey respondents found Slack the most innovative. They ranked Teams fourth for reliability, compatibility and user-friendliness, behind Skype for Business, Slack and Google Hangouts, respectively. But Teams was viewed as the leader for security, manageability and cost-effectiveness.
Posted by Scott Bekker on 12/10/2018 at 12:01 PM
For the emerging area of Internet of Things (IoT), developers face a confusing array of choices in a few different areas within Microsoft's catalog of Azure services.
To that end, Microsoft Azure MVP and Microsoft Regional Director Eric Boyd offered some guidance on a couple of key architectural questions this week, as part of a session at the Live! 360 conference in Orlando.
Boyd, the founder and CEO of responsiveX, has spent the last few years experimenting with a burgeoning collection of IoT devices and components in his home, and with the ways he can use Azure services to light up and connect the devices.
Watching his enthusiasm during a Raspberry Pi demo that ran throughout his presentation, you could tell he's been doing it partly because it's fun. The larger purpose has been getting to know the technology so well that he can help his clients figure out how to implement IoT in meaningful ways.
"What the IoT is all about is not tinkering and building the Raspberry Pi. It's about taking all the everyday things in our life and connecting them," Boyd said. "IoT is the new norm. This is just like Web and mobile. It is just the way now. It's certainly something that a lot of you should be thinking about as you look out at devices on your factory floor or agricultural scenarios."
Connecting those devices is where Azure comes in, and the services can be overwhelming. For example, when it comes to messaging, a developer might be confused by the options of Service Bus, Event Hubs or IoT Hub. All can be, and have been, used in IoT solutions. Boyd offered a succinct overview in his session.
"OK, there are all these messaging services in Azure. When do I use which service?" Boyd asked. "IoT Hub is built on Event Hubs. If you don't have a scenario where you have devices -- and I use that term loosely because that can mean a lot of things, but if you have applications where you're wanting to stream data in -- Event Hubs is the better solution for you. If you have devices, then IoT Hub is the right fit. We did IoT before IoT Hub in Azure using things like Service Bus. We built a massive kiosk network in Azure that you guys have all been customers of. But IoT Hub simplifies things [for IoT scenarios]."
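To make the device side of that split concrete, here's a minimal sketch of the kind of JSON telemetry a Raspberry Pi might stream to IoT Hub. The field names and simulated sensor reading are illustrative assumptions, not a required schema; with Microsoft's azure-iot-device Python SDK, a string like this would be handed to the device client's send_message method:

```python
import json
import random
import time

def build_telemetry(device_id: str) -> str:
    """Format a device-to-cloud telemetry message as a JSON string."""
    # On a real device the reading would come from an attached sensor;
    # here it's simulated, and the field names are made up for illustration.
    return json.dumps({
        "deviceId": device_id,
        "timestamp": time.time(),
        "temperatureC": round(random.uniform(18.0, 28.0), 2),
    })

msg = build_telemetry("raspberry-pi-01")
print(json.loads(msg)["deviceId"])  # → raspberry-pi-01
```

The distinction Boyd draws maps directly onto this sketch: if there's a physical device with an identity behind each message, IoT Hub is the fit; if it's just application data streaming in, Event Hubs suffices.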
Boyd also offered a way to think about the difference between IoT Central and IoT solution accelerators, two different services in the Azure catalog both intended for developers getting started with IoT. Both can get you up and running quickly, but IoT Central is more limiting. "It probably isn't your long-term strategy," Boyd explained.
"Azure IoT Central is a SaaS service. You can think of it like Office 365 for IoT. You can just go spin up a service really quickly without having to think about code. It's not a bad service. If you want to just kick the tires and prove something out and demo it to your executive group, it's great for that. It may be a good service to go pilot some things, as well," he said.
You can also get off to a quick start with the IoT solution accelerators, but those are much better as a starting point for an enterprise solution, he explained. The accelerators automatically spin up various IoT-related services for canonical, pre-built scenarios, including remote monitoring, connected factory, predictive maintenance or device simulation.
"This looks similar, but it's not the same," Boyd said of the accelerators in comparison to IoT Central. "You can modify it, redeploy it. The code for this dashboard, unlike IoT Central, is available to you, so you can tweak and customize it however you need it."
Posted by Scott Bekker on 12/06/2018 at 3:56 PM
Microsoft's artificial intelligence (AI) capabilities are popping up all over the stack, sometimes in surprising places.
Pranav Rastogi is one of the people inside Microsoft helping drive those capabilities and technologies across Microsoft's vast array of products. In the keynote for the inaugural Artificial Intelligence Live! track at the Live! 360 conference on Tuesday, Rastogi provided attendees with an overview of what those technologies are and where they're starting to emerge in products.
"The idea here is really to democratize AI for each and every employee so that it's available and employees can use it to transform their own businesses," Rastogi, a program manager at Microsoft, told an audience of several hundred attendees at the Orlando conference.
During his hour-long talk, Rastogi provided a tour of AI technologies that can immediately be leveraged by developers, end users and business analysts. To date, AI has mostly been the domain of data scientists. Rastogi's discussion dealt with the other user profiles who may not think of themselves as potential users of AI right now. An example was a slide labeled, "Introducing the Citizen Data Scientist."
All of the AI technologies he highlighted fit into a bucket that Data Relish Ltd. Principal Jen Stirrup, another speaker at the conference, described Tuesday as the types of machine learning capabilities that are commonly coming online right now -- training computers to do a single task at roughly a human level of proficiency. That's as opposed to the strong AI of self-directed fictional scenarios like R2-D2 in "Star Wars," Skynet in "The Terminator" or HAL 9000 in "2001."
For Microsoft, the AI democratization journey has three phases. First is infusing every Microsoft application with some AI capabilities so that early adopter customers can leverage the technologies if they're looking for them. The second phase involves bringing AI to every business process, which would mean driving adoption among users both through increased ease of use and raising awareness of the vertical and horizontal benefits of using Microsoft's tools. The final phase is getting every employee at all of Microsoft's customers using the AI capabilities in some way.
The "every application" phase is in the early stages but spreading quickly across many products, making the effort already broad, if not particularly deep. As an example, Rastogi showed how Microsoft is redefining existing applications with AI using the pre-built AI services, such as Vision, Speech, Language and Search. Those capabilities are being used to create new conversational experiences inside other applications like Microsoft's own Cortana, Office and Skype, as well as other applications like Slack, Facebook Messenger and Kik Messenger.
Rastogi also showed how dense the company's flagship AI platform, Azure, is getting with machine learning capabilities. At the first level are the sophisticated pre-trained models that are ready to be called from within other applications, such as the Vision, Speech, Language and Search services mentioned earlier. The lengthy list of Azure services also includes a few designed to help data science and development teams, such as Azure Databricks, Azure Machine Learning and Machine Learning VMs. Additionally, Rastogi highlighted the Azure options for using AI-optimized hardware in Microsoft's datacenters, and for having the compute performed in the cloud, on-premises or at the edge.
The product set where the "AI everywhere" story appears strongest is Power BI, Microsoft's business intelligence platform for accessing, manipulating and visualizing data. A product that essentially aimed to democratize BI is now evolving to do the same for AI, as well. There are capabilities for data scientists, certainly, including Power Query integration for Azure Machine Learning and integrations with Azure frameworks. Data scientists and BI professionals can also script in R or Python or create machine learning models through point-and-click tools.
But end users also have ways to explore AI through Power BI, using Natural Language exploration. Examples of the types of things that end users or business analysts can leverage in Power BI include sentiment analysis, key-phrase extraction, optical character recognition and text translation.
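To make the sentiment-analysis example concrete, here's a deliberately tiny lexicon-based scorer illustrating the kind of score such a feature returns. Power BI delegates the real work to Azure's pre-trained language models; this toy word list is purely an illustration, not how Microsoft implements it:

```python
POSITIVE = {"great", "love", "excellent", "fast", "helpful"}
NEGATIVE = {"slow", "broken", "terrible", "confusing", "bug"}

def sentiment_score(text: str) -> float:
    """Return a score in [0, 1]: 0 is negative, 0.5 neutral, 1 positive."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos + neg == 0:
        return 0.5  # no opinion words found: treat as neutral
    return pos / (pos + neg)

print(sentiment_score("Great product, love the fast setup!"))  # → 1.0
print(sentiment_score("Confusing UI and a terrible bug."))     # → 0.0
```

A business analyst pointing this kind of scoring at a column of customer comments gets a numeric field ready for charting, which is exactly the workflow Power BI's AI features are aimed at.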
Most of the AI capabilities Microsoft enables today still require a lot of leading-edge expertise, integration, development work and data science expertise. Yet it's clear that Microsoft is working rapidly to integrate those technologies all the way out to end-user-facing applications and will continue to push hard in that direction.
Posted by Scott Bekker on 12/05/2018 at 9:28 AM
Artificial intelligence (AI) will come into focus at the Live! 360 conference for Microsoft-focused developers and IT professionals this week in Orlando, Fla.
Live! 360 brings together Converge360's events for one combined conference with each event as a track. (Editor's note: Converge360 is the parent company of Redmondmag.com.) In addition to Visual Studio Live!, SQL Server Live!, TechMentor, Office & SharePoint Live! and ModernApps Live!, this year the conference is rolling out an entire Artificial Intelligence Live! track.
"We are excited about the AI Live launch and how that ties in nicely with our overall program of incubating new topics at Live! 360 and giving the Live! 360 attendees the opportunity to broaden their educational reach and knowledge base by attending any sessions across the six events," said Brent Sutton, vice president of Converge360 Events.
Headlining the AI track is a Tuesday morning keynote from Pranav Rastogi, a program manager at Microsoft who focuses on making developers successful with AI. His keynote is "Enabling Enterprise Developers in AI -- How Microsoft is Doing It." AI has been a huge messaging push for Microsoft over the last year and a half, and Rastogi is expected to talk about Microsoft technologies that support AI projects, as well as how Microsoft is using the approaches internally and in customer implementations.
Andrew Brust, conference co-chair for the Artificial Intelligence Live! track, as well as for the Visual Studio and SQL Server tracks, says the Live! 360 AI content will reflect the conference's roots in giving developers practical guidance.
"Most of the AI conferences out there are really like data science conferences. We will have that content, but not only that. Because it's VS Live!, we will have content for developers [about AI bots and features]," Brust said. "It's AI aimed at developers rather than AI aimed at AI specialists."
One example of the type of content that Live! 360 specializes in is being run by Brust, and will cover new AI features that Microsoft has just integrated into Power BI and how to make use of those capabilities. Another is a workshop by experienced BI expert Jen Stirrup on how BI professionals can transition into AI.
The main technology keynote for all conference tracks is on Wednesday, when Donovan Brown, the Principal DevOps Manager at Microsoft's Cloud Developer Advocacy Team, presents on "Enterprise Transformation." The talk will focus on the transition of Microsoft Visual Studio Team Services from a three-year waterfall delivery cycle to three-week iterations, open source elements and the Git Virtual File System.
Also Wednesday, James Montemagno, Microsoft Principal Program Manager in the Mobile Developer Tools unit, is scheduled to deliver an authoritative session on the future of .NET and Visual Studio.
Some of the other major technologies and themes being addressed by the more than 100 expert speakers this year include containers and the Azure Kubernetes Service, Azure Cosmos DB, PowerApps, Microsoft Flow, Windows Server 2019, Windows 10 updates, Microsoft Graph, Internet of Things (IoT) and Office 365 security.
Posted by Scott Bekker on 12/03/2018 at 9:33 AM
As the tides fall in the tech sector, Microsoft's market cap has emerged as the largest.
Back when tech stocks were on the upswing, Apple and Amazon both drove and benefited from the trend, reaching market caps over $1 trillion, with Microsoft and Alphabet close behind.
Now that the tech sector is falling along with markets overall, Microsoft is falling less quickly.
In midday trading Monday, Microsoft surpassed Apple as the most valuable company in the United States. Microsoft's market capitalization was $812 billion, about $1 billion higher than Apple's.
News has been rough for Apple over the last few weeks, with the stock losing nearly a quarter of its value since a September high on reports of drops in smartphone demand. Microsoft, on the other hand, continues to deliver on its pivot from a Windows-first to a cloud-first business.
Even though Microsoft seems to have regained supremacy from Apple on this one business measure (for the moment, at least), Microsoft stock is nearly 9 percent off its record high from early October.
Posted by Scott Bekker on 11/26/2018 at 11:55 AM
An acquisition this week layers some new resources onto Microsoft's already rich capabilities in the area of conversational AI.
Microsoft on Wednesday announced it had signed an agreement to acquire XOXCO, based in Austin, Texas. Like most of the dozen-plus acquisitions Microsoft makes each year, terms weren't disclosed, which usually indicates a fairly small company and a small team.
In a blog post about the deal, Lili Cheng, Microsoft corporate vice president for Conversational AI at Microsoft, described XOXCO as "a software product design and development studio known for its conversational AI and bot development capabilities." Cheng cited examples of XOXCO's previous work, including Howdy, a meeting scheduling bot for Slack; and Botkit, a set of development tools that is popular on GitHub.
Given Microsoft's sizable internal investments over the last few years on the digital personal assistant Cortana, the Microsoft Bot Framework, natural language processing and other artificial intelligence-related services, it's unclear from the brief blog post how much new capability XOXCO brings to the company. However, Cheng notes that Microsoft has partnered with XOXCO on projects over the last few years.
"We have shared goals to foster a community of startups and innovators, share best practices and continue to amplify our focus on conversational AI, as well as to develop tools for empowering people to create experiences that do more with speech and language," Cheng wrote.
One area that will be interesting to watch is how XOXCO plays into Microsoft's ongoing effort to push Teams as a competitor to Slack. The XOXCO Web site is currently replete with references to Slack, and a $1.5 million funding round three years ago was all about developing for Slack. As one of the early movers in the Slack commercial ecosystem, will XOXCO become a Microsoft effort to have a presence on that platform, or will the team's expertise be redirected to building bots, tools and add-ons for Teams exclusively?
Posted by Scott Bekker on 11/14/2018 at 12:03 PM
Symantec Corp. is buying its way into the business of defending Active Directory against reconnaissance attacks.
The Mountain View, Calif.-based security giant bought privately held Javelin Networks earlier this week for an undisclosed amount.
Founded in 2014, Javelin Networks has focused on giving organizations a tool to partially defend against the advanced persistent threat (APT) attacks that are largely attributed to sophisticated hacking groups or nation-state attackers. Examples of high-profile attacks in those categories include APT28, APT29 and DUQU 2.0.
Javelin Networks' organizing principle is that Active Directory is a popular and effective way for hackers to get information about corporate networks for lateral movement and privilege escalation, once they've found some other way into the network.
The company set forth its take on the problem with Active Directory in a whitepaper last year: "AD can't distinguish between a legitimate query and an illegitimate query. As long as the query came from an authenticated user, it has no choice but to answer, revealing the organization's biggest secrets. With a legitimate query, the attacker doesn't get just part of the organization's information -- they get all of it in a matter of seconds without any risk of being detected."
Javelin Networks contends that the vast majority of sophisticated attackers use Active Directory recon techniques once they're inside a network rather than noisier methods like network scans and protocol scans that tend to trigger alerts in security tools.
The company's main solution, AD|Protect, is designed to detect breaches autonomously, apparently based on profiles of the kinds of Active Directory recon that attackers typically try to carry out once they've gotten control of a domain-connected system. Javelin Networks uses natural language processing and other technologies to obfuscate the network in a way that is supposed to prevent the attacker from moving laterally. At the same time, the system kicks off forensics to help IT document and track the attack. Additionally, the tool probes for domain misconfigurations and persistence on an ongoing basis, an important feature given the months and years that APTs can remain hidden in a network.
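As a rough illustration of the general detection idea (Javelin's actual algorithm isn't public), a monitor might flag an account whose count of distinct directory-object queries in a time window far exceeds normal interactive use. The threshold and event format here are made-up examples:

```python
from collections import defaultdict

# Hypothetical threshold; a real product would tune this per environment.
WINDOW_QUERY_LIMIT = 20  # distinct AD objects queried per time window

def flag_recon(events: list[tuple[str, str]]) -> set[str]:
    """events: (account, queried_object) pairs observed in one window.
    Returns accounts whose distinct-query count looks like enumeration."""
    seen = defaultdict(set)
    for account, obj in events:
        seen[account].add(obj)
    return {a for a, objs in seen.items() if len(objs) > WINDOW_QUERY_LIMIT}

# A normal user touches a handful of objects; an enumerator touches hundreds.
normal = [("alice", f"share-{i % 3}") for i in range(50)]
recon = [("mallory", f"computer-{i}") for i in range(500)]
print(flag_recon(normal + recon))  # → {'mallory'}
```

The toy captures the asymmetry Javelin Networks describes: legitimate users query a few objects repeatedly, while an attacker enumerating the directory touches nearly everything once.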
Aside from the main, horizontal version of AD|Protect, Javelin Networks offered specialized versions for business services, critical infrastructure, energy, financial services, government, health care, information security and retail.
Other tools in the company's portfolio include an Active Directory breach and attack simulation product for finding misconfigurations and backdoors called AD|Assess, and some offerings designed for corporate penetration testers.
Because Active Directory recon efforts aren't standalone -- attackers need to be inside the network already to use the technique -- the acquisition by Symantec makes sense. The company will be bundling AD|Protect with other endpoint security products in its broad portfolio.
Symantec put the Javelin Networks team, which is split between offices in the United States and Israel, into its endpoint security business.
Posted by Scott Bekker on 11/08/2018 at 3:22 PM
Security researchers at Kaspersky Lab this week provided an update on what personal digital data is worth on dark Web markets.
It's not a new idea; security researchers provide this data every few years. But it's always interesting to hear what data is going for. The upshot: a consumer's entire digital life is worth less than $50.
David Jacoby, a senior security researcher at Kaspersky Lab, spent some time poking around sites where stolen user identities and accounts were on sale, and blogged about it here.
- The easiest information to find is hacked accounts, and they're not worth very much individually. "The price for these hacked accounts is very cheap, with most selling for about $1 per account, and if you buy in bulk, you'll get them even cheaper," Jacoby wrote.
- We're used to innovative business models from the digital crime community thanks to ransomware. Interesting schemes are emerging in the hacked account arena, as well. "Some vendors even give a lifetime warranty, so if one account stops working, you receive a new account for free," Jacoby noted, citing an example involving Netflix accounts.
- Dumpster diving is being used for purposes other than targeted, individual attacks. "People actually steal other people's mail and collect invoices, for example, which are then used to scam other people. They will collect and organize these invoices by industry and country. The vendors then sell these scans as part of a scammer toolbox," he found. "A scammer can use these scans to target victims in specific countries and even narrow their attacks down to gender, age and industry."
- Adding up various elements of a person's digital life, Jacoby estimated that hackers are selling people's complete digital life for less than $50, not including bank accounts, but often including services that might have a credit card attached.
- Online underground marketplaces were also trading in fake documentation, including fake ID cards, driver's licenses and passports. A registered Swedish passport, for example, was on sale for $4,000, Jacoby found.
The whole post is worth a read, but at the least it's a reminder, as if any were needed, of how ubiquitous hacked accounts are. There are so many out there that they're very cheap for other bad actors to purchase.
Posted by Scott Bekker on 11/07/2018 at 11:24 AM
Microsoft's collaborative deal with Walmart is shaping up to be a digital transformation laboratory.
The companies announced a five-year agreement in July that included enterprisewide use by Walmart of Microsoft Azure cloud services and Microsoft 365, the end user package that includes Office 365, Windows 10 and Enterprise Mobility + Security (EMS) functionality.
As one of the first steps in the agreement, the companies on Monday unveiled plans to jointly staff a "cloud factory," basically an expansion of Walmart's existing technology center in Austin, Texas (pictured), starting early next year. In all, the office will house 30 technologists, including an undisclosed number of Microsoft engineers mixed in with the Walmart technology specialists.
The cloud factory's assignment includes a lot of the types of projects Microsoft has been routinely encouraging customers to undertake. In the lift-and-shift category, they'll be migrating thousands of internal Walmart business applications to Azure. The team will also be building new, cloud-native applications.
Beyond modernizing applications by putting them in Azure, the collaboration will include work on emerging technologies. For one thing, Walmart already has Internet of Things (IoT) sensors in a lot of locations.
Clay Johnson, Walmart executive vice president and enterprise chief information officer, said in a Q&A on Microsoft's site that they'll work with Microsoft to get data from existing and future sensors into Azure, where they can analyze it in new and different ways.
"With our IoT work and sensor enablement, we're looking at our energy consumption and other factors to predict equipment failures before they happen. Improving equipment performance can result in enhanced energy efficiency, which lowers costs and our carbon footprint," Johnson said. "Putting IoT data into edge analytics lets us look at data at a store level and backhaul it to Azure to look at it across a region or the whole U.S. We started talking to Microsoft about this concept of a set of stores being a 'micro-cloud,' and you roll them into Azure for data analytics and insights."
Artificial intelligence (AI), chatbots and natural language processing -- three more hot areas of digital transformation -- will also get tested at a massive scale in the Walmart environment, spearheaded by the Austin-based joint team.
Projects will include internal chatbots designed to help Walmart's 2.2 million employees navigate benefits, chatbots for managing supplier interactions and natural language processing of terabytes of unstructured text to improve business operations.
"Microsoft's going to get to see stuff at a scale they've never seen before," Johnson said of the Walmart environment. The retailer had $500 billion in revenues in fiscal 2018 and operates 11,200 stores worldwide. "I think they'll learn a lot from our footprint. Co-locating top engineers from both companies will deepen the technical brainpower for creating disruptive, large-scale enterprise solutions for Walmart."
Posted by Scott Bekker on 11/05/2018 at 12:52 PM
Originally, the game was to become the first company with a trillion-dollar market capitalization. Apple won that race in August, and Amazon came in second in September.
After a brutal October for stock markets and tech stocks especially, the next phase of the game appears to be who can stay a trillion-dollar company and for how long.
Apple was still barely hanging on to its trillion-dollar status at the end of the trading day Friday, after a disappointing earnings report Thursday afternoon that included a big miss on iPhone sales, a softer-than-expected forecast for the next quarter, and news that it would start hiding iPhone, Mac and iPad unit sales numbers in future earnings reports. Apple did dip below the trillion-dollar mark during trading Friday but recovered.
Amazon fell out of the exclusive trillion-dollar club weeks ago, and its market cap was around $815 billion on Friday.
All of which leaves Microsoft, sitting at $810 billion on Friday, very much in the top tier of highly valued tech companies with a shot at the trillion-dollar tier. Other contenders include Alphabet (Google) at $741 billion and Facebook at $434 billion.
Unlike the others, Microsoft has largely escaped the fake news, privacy and labor scandals of the last few months, and seems to be sinking on broad market trends rather than fundamentals or investor dissatisfaction.
We'll stay tuned.
Posted by Scott Bekker on 11/02/2018 at 2:22 PM
Samsung Electronics America Inc. on Friday began selling a Snapdragon-based 2-in-1 mobile PC with LTE support for always-on connectivity.
The Samsung Galaxy Book2 is now available for $1,000 at AT&T, Microsoft.com and Samsung.com. The device will be coming to Sprint and Verizon later this month.
"By building out our 2-in-1 portfolio and adding LTE connectivity, we're ensuring consumers can stay connected on the mobile computing device that's perfect for them to get more done and express their creativity," said Alanna Cotton, senior vice president and general manager at Samsung Electronics America, in a statement about the Galaxy Book2 and a related Chromebook device called the Samsung Chromebook Plus (LTE).
The Galaxy Book2 is being sold as a consumer device and ships with Windows 10 in S Mode, but Samsung is emphasizing its capability as a work device. The company is claiming up to 20 hours of battery life in S Mode.
Other features include a 12-inch display, a weight of 1.75 pounds, the Qualcomm Snapdragon 850 chipset, 4GB of memory, 128GB of storage, dual cameras, two USB-C ports, wireless support and an included pen and keyboard.
Posted by Scott Bekker on 11/02/2018 at 8:44 AM
IBM's plan to spend a whopping $34 billion to acquire Red Hat is all about the cloud. This big bet is partially a bet that the cloud gold rush isn't over, with IBM locked in a distant third place.
On Sunday, Big Blue unveiled its acquisition bid for Red Hat at a price that represented a 63 percent premium over Red Hat's share price.
Calling the acquisition a cloud market game-changer, IBM Chairman, President and CEO Ginni Rometty predicted that with the deal, "IBM will become the world's #1 hybrid cloud provider, offering companies the only open cloud solution that will unlock the full value of the cloud for their businesses."
With earnings season winding down, analysts at Synergy Research late last week detailed market share estimates for spending on cloud infrastructure services. As with every other quarter for as long as the market has been tracked, Amazon Web Services (AWS) is the clear No. 1, with Microsoft at a distant but strong No. 2. According to Synergy, the market share numbers are AWS 34 percent, Microsoft 14 percent, IBM 7 percent, Google 7 percent and Alibaba 4 percent.
Rometty laid out IBM's view of the state of the cloud market, specifically that it's not too late for a big move. "Most companies today are only 20 percent along their cloud journey, renting compute power to cut costs," she said in the company's acquisition announcement. "The next 80 percent is about unlocking real business value and driving growth. This is the next chapter of the cloud. It requires shifting business applications to hybrid cloud, extracting more data and optimizing every part of the business, from supply chains to sales."
Such descriptions of the opportunity are similar to the "digital transformation" rhetoric coming out of Microsoft over the last year.
How Red Hat changes the game for IBM isn't completely clear.
Much of the current market share position has to do with the massive datacenter buildouts of the last decade. Red Hat has been an open source software provider for several of the public cloud players, rather than a front-running datacenter infrastructure player. And IBM says the company remains committed to building and enhancing the partnerships Red Hat has with major cloud providers, going on to cite existing arrangements with AWS, Microsoft Azure, Google Cloud and Alibaba.
"IBM is committed to being an authentic multi-cloud provider, and we will prioritize the use of Red Hat technology across multiple clouds," said Arvind Krishna, senior vice president of IBM Hybrid Cloud, in a statement. "In doing so, IBM will support open source technology wherever it runs, allowing it to scale significantly within commercial settings around the world."
That there is still room for innovation and competitive shakeups in the cloud is undoubtedly true. As Microsoft's and Google's ongoing investments show, none of the main cloud players is willing to cede the market to AWS. Whether IBM's acquisition of Red Hat changes its positioning in that market, however, remains to be seen.
Posted by Scott Bekker on 10/29/2018 at 11:34 AM
Microsoft beat Wall Street estimates for both revenues and earnings in the latest financial quarter, and its stock price was riding more than 5% higher at mid-day Thursday in the wake of the previous evening's report.
While financial analysts on the Microsoft earnings call Wednesday evening were positive overall about the quarterly results, they asked CEO Satya Nadella and CFO Amy Hood repeatedly about the growth rate for Microsoft Azure and came at the issue from many different angles.
Azure revenues are up 76% over the year-ago quarter. That's impressive, especially considering Microsoft is starting from a relatively high number as the No. 2 public cloud provider after Amazon Web Services (AWS).
Yet the number is slightly worrying to financial analysts, who note that the Azure growth rate figure has been marching steadily downward over the last few years. By comparison, the growth rate for the same quarter a year ago was 90%; two years ago, it was 116% for the quarter.
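One reason the declining percentages alone can mislead: because each year's growth compounds on a larger base, the absolute revenue Azure adds can keep rising even as the rate falls. A quick sketch makes the point, using an arbitrary indexed base of 100 (Microsoft does not disclose Azure revenue, so the base is an assumption for illustration only):

```python
# Illustration: a falling growth *rate* can still mean rising absolute
# gains, because each year's growth compounds on a larger base.
# The base of 100 is an arbitrary index, not actual Azure revenue.
base = 100.0
growth_rates = [1.16, 0.90, 0.76]  # 116%, 90%, 76% year-over-year

revenue = base
for rate in growth_rates:
    added = revenue * rate  # dollars (indexed) added this year
    print(f"growth {rate:.0%}: {revenue:.0f} -> {revenue + added:.0f} (+{added:.0f})")
    revenue += added
```

Run on those assumptions, the indexed revenue added each year grows (+116, then +194, then +312) even as the headline rate slides from 116% to 76%, which is part of why Microsoft's leadership sounds less alarmed than the analysts.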
Nadella and Hood together conjured an extremely positive story about Azure's future built on three major elements.
One element is that Microsoft's hybrid approach to the cloud is not only a strategic advantage, but that analysts should be thinking about it as effectively hiding some Azure revenues. Microsoft's unique attribute, compared to major public cloud competitors AWS or Google Cloud Platform, is that the company has a huge installed base of on-premises server software customers. That legacy encouraged Microsoft to focus more on hybrid solutions that allow customers to move workloads to the cloud at their own pace and to integrate all kinds of services between the on-premises servers and the cloud platform. Over the last year, Microsoft has moved to bring its licensing model in line with that hybrid approach, especially via Azure hybrid benefits.
Hood made that case in an answer to one analyst. "I tend to focus...on the 'all up' server and product [Key Performance Indicator] because the Azure hybrid benefits that exist with Windows Server and SQL Server are really valuable to customers if they want to move to Azure on their own terms," Hood said. "If we start to focus on one number or the other, I think we're missing the fact that our customer method and go-to-market is actually through the overall product portfolio."
Nadella hammered the theme home in response to a different question. "We don't think of hybrid as some stopgap in a move to the cloud," he said. "[It's] not just the old workloads but most importantly for new workloads, and that's where we're seeing some very significant good feedback loops in shaping even our future roadmap. And this is a place where we are leading."
Another basic element of Microsoft's Azure story is one of driving cost out of the platform. Hood said commercial cloud gross margin percentage increased 4 points to 62% driven by significant improvement in Azure gross margin. Nadella added that the margin improvements are crossing Microsoft's product boundaries as more of the infrastructure is unified. "For the first time, what you see across Microsoft is really one platform, which spans all of these businesses and all of the margin structures that are there represented in it," he said.
The other element is Microsoft's rapid buildout of Azure services. Nadella enumerated some of what he described as 100 new Azure capabilities introduced in the previous quarter (mostly at Microsoft Ignite), including Azure Confidential Computing, Azure Sphere and Azure Digital Twins. He pointed out that new and higher-level services should generate higher margins over time.
That three-part case -- that hybrid is a competitive strength that partially obscures some Azure revenues, that margins are improving and that rapid innovation around Azure capabilities promises more upside -- seemed for now to have satisfied investors, given the pop in Microsoft share prices.
Posted by Scott Bekker on 10/25/2018 at 12:00 PM
Most of the headlines Bill Gates makes these days relate to his philanthropic work with the Bill and Melinda Gates Foundation or to his default role as a public intellectual. He's frequently quoted on topics ranging from technology trends to global health to economics to environmental issues to his current reading list.
Rarely anymore is Gates deployed by Microsoft as an official public spokesman. Yet the Microsoft co-founder is still deeply involved in some operations at the company.
When Satya Nadella took over as CEO in early 2014, Microsoft made a point to communicate that Gates would be getting more involved on a week-to-week basis with the company than he had been during the latter part of Steve Ballmer's tenure as CEO.
"I'm thrilled that Satya has asked me to step up, substantially increasing the time that I spend at the company," Gates said in a welcome video accompanying Nadella's promotion to CEO. "I'll have over a third of my time available to meet with product groups, and it will be fun to define this next round of products, working together."
At the time, Gates' technical adviser role was widely viewed as a critical step to reassure investors that Nadella, who was less well-known on Wall Street than he was in Silicon Valley, would be able to handle the CEO job. Adding to the potential for investor skittishness was the simultaneous move by Microsoft to lower Gates' profile on the Microsoft board of directors by having him trade the chairman's role for a regular seat.
Most of those concerns have evaporated as Microsoft stayed near the forefront of a historic run in tech stocks over the last few years. Microsoft's stock value has roughly tripled on Nadella's watch. "He's done a good job of repositioning the company in investors' minds," Ballmer said of Nadella during an interview with Bloomberg in July.
Consequently, there's been a lot less attention paid to Gates' role at Microsoft. Nonetheless, in a pre-recorded Wired video segment about key moments in Gates' life and career that was posted last week, Gates confirmed that he's still putting in time with engineers and technical strategists in Redmond.
"Even to this day I do some architecture things on the various products," Gates said during the segment.
Posted by Scott Bekker on 10/22/2018 at 10:18 AM
Paul Allen went on to do many significant things in his life, but the achievement that provided the springboard for so many of the rest of his activities was the fortune he amassed as the co-founder of Microsoft.
Primarily, Microsoft is associated with the other co-founder, Bill Gates, whose personality, drive and talents formed the company's identity, and who remains involved in the company's direction on a part-time basis.
On the other hand, Allen, who died of complications of non-Hodgkin's lymphoma at age 65 this week, had been out of day-to-day activity at Microsoft since 1983, and off the board since 2000. His day-to-day time at the company ended before Windows became a dominant product, before the Internet emerged as an opportunity for the tech industry and a threat to Microsoft's central position in PC computing, before the public ugliness of the antitrust case, before Microsoft's rise as a major enterprise software player and before the company emerged as one of the handful of cloud megavendors.
That said, the 43-year-old company still bears a few important markers left by Allen himself.
One is the name. Calling the company "Micro-Soft" was Allen's idea. The hyphen was later dropped, but four decades later, the company still goes by a name reminiscent of a different era in tech. In 1975, "microcomputers" and "micro" were sexy terms in computing. Software remains an element of the name, as well, even in an age in which Microsoft has become more about the cloud and hardware has also become a significant piece of the business.
A bigger legacy of Allen's is his role as a catalyst for getting the slightly younger Gates focused full-time on the computer business. They spent countless hours together at the private Lakeside School in Seattle working on a teletype terminal connected over a phone line to a time-share computer. In the small scrum of like-minded Lakeside students spurring on each other's enthusiasms for the technical possibilities of the systems, Gates and Allen were especially close.
Living in Boston a few years later, Allen grabbed his friend Gates from his college dorm at Harvard to show him the January 1975 issue of Popular Electronics with its Altair 8800 on the cover. "This is happening without us!" Gates recalls Allen declaring in a successful effort to rally Gates to prioritize seizing the moment over getting a college degree.
"Microsoft would never have happened without Paul," Gates said in a statement earlier this week. Counterfactuals are difficult to prove. It's hard to imagine that given his interests, Gates would not have seen the magazine himself or seized the moment in some other way. But without the timing enabled by the personal history and chemistry between those two individuals, who knows?
Another Allen imprint on Microsoft's culture is his role in one of the most famous coding death marches in technology. After spotting that article about the Altair, Gates and Allen called the maker of the device and told him they were essentially finished with a version of Basic for it. They hadn't started.
They spent the next few weeks working around the clock. Allen, who was to do the demo, realized on the plane to Albuquerque that they hadn't written a loader, a requirement for their demo, and whipped one up on the plane. At the time, they were calling their company "Traf-o-Data," but the core of Microsoft was there. As Stephen Manes and Paul Andrews wrote in their biography of Gates, "The development tools Allen put together in this era would serve as the core of Microsoft's language efforts for years."
Finally, what Allen realized was "happening without" them was the democratization of computing that drove Microsoft's growth strategy -- enabling a PC on every desktop -- for most of its corporate history.
Outside of Microsoft, Allen lived the dreams enabled by an early fortune for a technology titan. He invested in successful commercial space flight ventures, bought professional sports franchises, commissioned massive yachts, founded newsworthy companies and backed scientific research projects. Ultimately, he may be more widely remembered for his role in sports or those other activities, but whatever Microsoft would have been called without Paul Allen, it certainly would have been a different company.
Posted by Scott Bekker on 10/18/2018 at 10:55 AM
The Microsoft Surface is becoming a factor in the U.S. PC market, according to market researchers.
Gartner this week published its preliminary quarterly results for PC unit shipments in the third quarter of 2018.
On the U.S. list, Microsoft ranked fifth for the quarter, with a 4.1 percent share of the market.
It's a long way from No. 5 to No. 4. According to Gartner, Microsoft shipped 602,000 units in the quarter. Apple, by contrast, moved more than 2 million units in the quarter for fourth place and a 13.7 percent share. The top vendor for the quarter, HP Inc., sold more than 4.5 million units and held a 30.7 percent share in the U.S. market.
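As a sanity check on those figures, each vendor's units divided by its share should imply roughly the same total U.S. market size. The Apple and HP unit counts below are approximations of the "more than" figures quoted above:

```python
# Back-of-the-envelope check on Gartner's U.S. figures:
# units / market share should imply a consistent total market size.
vendors = {
    "Microsoft": (602_000, 0.041),
    "Apple": (2_000_000, 0.137),    # "more than 2 million" -- approximate
    "HP Inc.": (4_500_000, 0.307),  # "more than 4.5 million" -- approximate
}
for name, (units, share) in vendors.items():
    implied_total = units / share
    print(f"{name}: implied U.S. market ~{implied_total / 1e6:.1f}M units")
```

All three work out to roughly 14.6 million to 14.7 million units for the quarter, so the reported shares and shipments hang together.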
In terms of movement, Microsoft is heading in the right direction, with 1.9 percent year-over-year growth for the quarter, a period when Apple dropped by 7.6 percent.
Lenovo, powered by its joint venture with Fujitsu, vaulted 22 percent in the United States, and a 10.7 percent gain worldwide put the computer maker in first place on Gartner's global list for the quarter. Dell held second place in the U.S. market.
The backdrop is a relatively flat U.S. PC market, where overall shipments dropped 0.4 percent, with business PC demand nearly offsetting declines in mobile PC shipments. Because Gartner defines Google Chromebooks as outside the PC market, its figures don't include the double-digit growth of those devices in the United States.
Across all Windows PCs, not just Surface devices, Gartner expects business demand to remain strong. Referring to the global market, analyst Mikako Kitagawa said in a statement, "The PC market continued to be driven by steady corporate PC demand, which was driven by Windows 10 PC hardware upgrades. We expect the Windows 10 upgrade cycle to continue through 2020 at which point the upgrade demand will diminish."
Extended support for Windows 7 formally ends on Jan. 14, 2020. That operating system retained a narrow lead over Windows 10 for the most popular operating system in September, according to Net Applications. That measure and recent Microsoft statements suggesting enterprises were about halfway through their Windows 10 migrations both indicate that there is a lot of remaining potential for Windows 10-based hardware upgrades.
Posted by Scott Bekker on 10/12/2018 at 11:14 AM
Just four days after releasing the latest version of Windows 10 and other "version 1809" operating systems, Microsoft is issuing a recall.
"We have paused the rollout of the Windows 10 October 2018 Update (version 1809) for all users as we investigate isolated reports of users missing some files after updating," Microsoft said this weekend in a statement on its support site. As of Monday afternoon, Microsoft had not released a new update.
Affected platforms included Windows 10, version 1809; Windows Server, version 1809; Windows IoT Core, version 1809; Windows 10 Enterprise LTSC 2019; Windows 10 IoT Enterprise LTSC 2019; and Windows Server 2019.
Microsoft was advising customers who lost files to contact the company immediately. Meanwhile, the company warned anyone who had manually downloaded the installation media not to install it.
Despite the emphasis on the phrase "isolated reports," the decision to initiate such a major disruption by interrupting downloads indicates Microsoft has serious concerns about the code that's been released. There is always a warning to back up files before initiating an update, but a file deletion issue ranks among the most serious types of problems that an upgrade can introduce.
Pulling back the update is another black eye for the major overhaul of the update process in the Windows 10 era, which has been marked by a generally popular operating system but also by frustration over lack of control over updates, the speed of release cycles and testing quality.
The previous Windows 10 update, the April 2018 Update, ran into delays and post-release problems, and the patch release process is also drawing serious criticism from patching experts, including Susan Bradley, a Microsoft Most Valuable Professional (MVP) and moderator of the Patchmanagement.org mailing list.
Nobody is saying that quality control of a Windows operating system release is easy. Microsoft currently claims a 700-million-user installed base for the operating system, which runs on hundreds of hardware configurations alongside thousands of other software applications and cloud services.
This latest incident should suggest to Microsoft that it's time to swing the pendulum back a little from the pressures in technology to "move fast and break things" toward being more deliberate, cautious and exhaustive in the pre-release process for Windows.
Posted by Scott Bekker on 10/08/2018 at 10:15 AM
Microsoft on Tuesday refreshed three devices in the Surface hardware line, teased Surface-branded headphones that integrate with Cortana, and revealed a new consumer financing program for Surface buyers.
The new devices unveiled during the New York City media event included a Surface Pro 6 (pictured above on the left), Surface Laptop 2 (on the right) and Surface Studio 2. Microsoft began taking orders for all three devices on Tuesday. The Surface Pro 6 and Surface Laptop 2 will start shipping on Oct. 16. The Surface Studio 2 starts shipping Nov. 15.
The enhancements rolling out for the various Surfaces were mostly of the speeds, feeds and color variety. A new black option is coming to the Surface Pro 6, which is also available in platinum, and to the Surface Laptop 2, which is also available in platinum, burgundy and cobalt blue.
Major enhancements to the Surface Pro 6 include an upgrade to 8th Generation Intel processors, which bring quad-core processors to the 2-in-1 category for a 67 percent performance bump; a 13.5-hour battery life and an 8MP auto-focus camera, according to Microsoft executives. Prices for the new Surface Pro 6 range from $899 to $2,299 depending on the choices of 8GB or 16GB of memory, an Intel Core i5 or Core i7 processor, and storage options of 128GB, 256GB, 512GB or 1TB. Prices don't include the Surface Keyboard, so buyers are looking at a minimum of an additional $100 to use the device as intended.
For the Surface Laptop 2, Microsoft's more traditional clamshell laptop, the processor upgrade to Intel's 8th Generation leads to an 85 percent performance bump over previous models, while battery life clocks in at 14.5 hours, according to Microsoft. Prices on the Microsoft Store site start at $999 for 8GB of RAM, an Intel Core i5 and 128GB of storage. The top-end configuration of 16GB of RAM, Intel Core i7 and 1TB of storage costs $2,699.
The Surface Studio 2 sports the same gargantuan 28-inch screen as the previous version, but moves two generations forward in processors to Intel 8th Generation for a 50 percent performance bump, and features more brightness and contrast in the display. The high-powered system for creative professionals is available in three configurations. The entry level has 1TB of storage with 16GB of RAM at $3,499. The top-of-the-line model costs $4,788 for 2TB of storage and 32GB of RAM.
The main teaser from the Tuesday event was Surface Headphones, which aren't available to order yet and are expected to ship by the holidays.
As to why Microsoft needs an offering in the crowded premium headphone category, Microsoft product design guru Panos Panay said, "We built Surface Headphones to complete the Surface experience." The $349 price tag will deliver 13 levels of ambient noise control, 8 "beam-forming" microphones for phone calls and voice commands, and PC-specific features like voice activation of the Cortana digital assistant and automatic pausing of video when the headphones are removed.
Also Tuesday, Microsoft unveiled a new way for consumers to pay for Surface devices, which tend to carry a premium price tag compared to OEMs' Windows laptops and 2-in-1s. Called Surface All Access, the monthly pricing option includes a Surface device and an Office 365 subscription with a no-interest 24-month payment plan. The financing is provided by WebBank and is administered through Dell Preferred Account, rather than through Microsoft Financing.
Starting monthly bundle prices included a Surface Go Bundle for $24.99, Surface Laptop Bundle for $46.63, Surface Pro Bundle for $47.87, Surface Book 2 Bundle for $54.96 and Surface Studio Bundle for $150.79. The program, which begins on Oct. 16, comes about a month after Microsoft ended new enrollments into its consumer-focused Surface Plus Program.
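Since the plan carries no interest, the total paid is simply the monthly price times the 24-month term, which makes the all-in cost of each bundle easy to work out:

```python
# Surface All Access: no-interest 24-month plan, so total cost is
# simply the monthly bundle price multiplied by the term.
TERM_MONTHS = 24
bundles = {
    "Surface Go": 24.99,
    "Surface Laptop": 46.63,
    "Surface Pro": 47.87,
    "Surface Book 2": 54.96,
    "Surface Studio": 150.79,
}
for name, monthly in bundles.items():
    total = monthly * TERM_MONTHS
    print(f"{name} Bundle: ${total:,.2f} total over {TERM_MONTHS} months")
```

The Surface Go bundle, for example, works out to $599.76 over the term, and the Surface Studio bundle to $3,618.96 -- keeping in mind each bundle also includes an Office 365 subscription, so the totals aren't directly comparable to device sticker prices.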
Posted by Scott Bekker on 10/02/2018 at 4:51 PM
Just in time for the expected unveiling of some new Surface hardware on Tuesday, the current generation of Microsoft Surface devices got an "all clear" from the editors of Consumer Reports.
Microsoft's Surface lineup suffered a black eye last August when the nonprofit consumer research organization revoked the "recommended" designation from several Surface devices, including the fifth-generation Surface Pro, the Surface Book in 128GB and 512GB editions, and the Surface Laptop in 128GB and 256GB versions.
While Consumer Reports reviewers were generally impressed with the lab performance of the devices at the time, their consumer reliability surveys led the outlet to estimate that 25 percent of Microsoft laptops and tablets would experience problems by the end of the second year of ownership. Survey respondents reported startup issues, unexpected freezing, surprise shutdowns and insufficiently responsive touch screens.
With a new batch of annual surveys in hand, Consumer Reports last Thursday declared the Surface lineup eligible for "recommended" status and restored recommended ratings to the Surface Pro, Surface Book and Surface Laptop.
"Microsoft's reliability is now on par with most other laptop brands," said Martin Lachter, senior research associate at Consumer Reports, in a statement.
One existing Surface device isn't being recommended, however. The Surface Go is Microsoft's new 10-inch, 2-in-1 detachable that began shipping in August. Consumer Reports tested the $400 model with 64GB of storage and 4GB of memory and the $550 model with 128GB of storage and 8GB of RAM. Neither was recommended, largely because most 10- and 11-inch laptops struggle in the Consumer Reports testing regimen, which weighs processing power heavily.
While Microsoft released a dissenting statement last August when its Surface lineup was removed from the recommended list, the company did not comment on the Surface Go results.
Microsoft is expected to have plenty to say about the Surface lineup in New York on Tuesday at 4 p.m. EST, when it's holding a media event. Cryptic invites asked technology journalists only for a "moment of your time." However, observers of Microsoft's consumer hardware are anticipating several Surface, Windows 10 and possibly HoloLens announcements during the event.
Posted by Scott Bekker on 10/01/2018 at 1:21 PM
SQL Server 2019, Azure Sphere, in-use encryption in Azure and a major Azure IoT initiative called "digital twin" headlined a bounty of new technology previews from Microsoft for customers and partners to test and evaluate.
The batch of preview releases came during the Microsoft Ignite conference this week in Orlando, Fla. While the preview technologies aren't yet supported or touted as production-ready, their delivery marks the key milestone when a product goes from slideware to something concrete. (For coverage of technologies hitting general availability at Ignite, see this roundup.)
SQL Server 2019
The highest-profile public preview at Ignite is SQL Server 2019, the latest release of one of Microsoft's most significant server platform products. The preview is classified as a community technology preview (CTP). Getting the most attention in the new version of the database server is SQL Server Big Data clusters. Other additions include database performance gains, encryption improvements to protect data in use and significant indexing improvements.
See related coverage for details of the new features and a Q&A about SQL Server Big Data clusters.
Big Azure Advances
A key Microsoft initiative for securing the Internet of Things (IoT) also reached the public preview stage. Microsoft first unveiled Azure Sphere in April, but at Ignite, Microsoft announced the preview for the solution, which includes a microcontroller unit known as the Azure Sphere MCU, a Linux-based Azure Sphere OS and an Azure Sphere Security Service. An Azure Sphere development kit is immediately available for prototyping from Seeed for $85. Inspired by the explosion of IoT devices, their increasing connectivity to the Internet and other networks, and the emergence of IoT botnets like Mirai, Azure Sphere is Microsoft's bid to provide a framework that helps device makers and users secure IoT.
Another significant security enhancement effort in Azure is a public preview coming on Monday, Oct. 1, for an encryption-in-use solution called Azure confidential computing. Data is commonly encrypted at rest (by encrypting the files or disks on which it is stored) and in motion (via protocols such as HTTPS, SSL, TLS and FTPS). The trickier challenge is to protect data during processing, known as encryption in use. Microsoft at Ignite announced the public preview of a new Azure virtual machine family, which it calls the DC series, based on Intel SGX technology.
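The in-motion case is the most familiar of the three. As a minimal illustration (Python's standard library, not anything Azure-specific), a default TLS client context shows the baseline protections applied to data in transit; data in use is harder precisely because the CPU normally needs plaintext in memory, which is the gap SGX-style enclaves address.

```python
import ssl

# Data in motion is typically protected with TLS. Python's default client
# context shows the baseline a modern stack enforces: server certificates
# are verified and hostnames are checked before any data is exchanged.
ctx = ssl.create_default_context()

print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True: certificates verified
print(ctx.check_hostname)                    # True: hostnames checked
```

Nothing here touches data at rest or in use; it only demonstrates why the in-motion layer is considered a solved problem relative to encryption in use.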
Another public preview that is less than a month away is Azure Digital Twins, which is expected on Oct. 15. The big idea is a service for creating a virtual representation of a physical environment. Part of the IoT platform, an Azure Digital Twin will give partners and customers a platform to create comprehensive digital models and spatially aware solutions, Bert Van Hoof, partner group manager for Azure IoT, explained in a blog post: "Most IoT projects today start from a things-centric approach, but we've flipped that around. We've found that customers realize huge benefits by first modeling the physical environment and then connecting (existing or new) devices to that model."
The concept has been associated with industrial equipment, such as machines and engines, but Microsoft's new vision has more to do with creating smarter spaces out of offices, schools, hospitals, banks, stadiums, warehouses, factories, parking lots, streets, intersections, parks and plazas.
Azure Stack, Microsoft's private cloud version of its Azure public cloud that is sold by certified hardware partners, will become more container-friendly with a public preview of Kubernetes support. Kubernetes is the popular, open-source system for automated deployment, scaling and management of applications in containers. Azure Stack customers will now be able to install Kubernetes using Azure Resource Manager templates.
Microsoft declared a handful of other technologies as public previews at Ignite.
- Ad-hoc data exploration is the focus of Azure Data Explorer, a speed-optimized indexing and querying service for analyzing event data from apps, servers and edge devices.
- A preview will be available Oct. 1 for Azure SQL Database Hyperscale for single databases, with an auto-scaling capacity of a whopping 100TB per database.
- Seven months after making Azure Databricks generally available, Microsoft unveiled several updates to the Apache Spark-based analytics platform for building collaborative Big Data and artificial intelligence solutions. Among those is a preview of Azure Databricks Delta, a transactional storage layer atop Spark to improve data consistency and read access.
- A new Azure Managed Disks offering known as Azure Ultra SSD Managed Disks is designed for latency-sensitive workloads through the use of solid state drives.
- Azure Files is being updated with a high-performance, SSD-backed storage tier known as Azure Premium Files.
- Storage capacity for Azure Managed Disks is being expanded in certain regions to now cover 8, 16 and 32TB capacities. Those storage sizes will apply to Premium SSD, Standard SSD and Standard HDD.
- An application acceleration platform used internally for Bing, Office 365 and Xbox will be available as the Azure Front Door Service (AFD). It is designed for delivery, control and monitoring of global microservice-based Web applications.
- A solution for adding governance capabilities called Azure Blueprints is available in preview and will be included in the Azure platform at no additional cost once it is generally available.
- The Azure Resource Graph brings the ability to explore Azure resources through the Azure Portal, PowerShell or CLI for efficient inventory management.
- Supporting DevOps teams' efforts to ship more quickly with better compliance and auditing is a new Azure Policy tool, with additional features unlocked when it's used in combination with Azure Application Insights and Azure Monitor.
- The constant imperative to predict, monitor and contain Azure usage costs gets a native tool as Microsoft previews Azure Cost Management as a component of the Azure Portal for Enterprise Agreement customers. The capabilities were previously available as a standalone platform from Cloudyn.
Posted by Scott Bekker on 09/27/2018 at 8:44 AM
A slew of products and technologies are advancing into the general availability (GA) stage this week at Microsoft Ignite, including the latest version of Windows Server, strategic Internet of Things (IoT) offerings and a number of Azure services and features.
Kicking off its flagship IT conference Monday morning with a keynote by CEO Satya Nadella, Microsoft released details on more than a dozen major products, services and frameworks. While Microsoft's product cycle has kept pace with industry trends to become more fluid in recent years, GA is still a key milestone that signals a product will be fully supported in production environments.
Windows Server 2019
Headlining the announcements of products reaching GA at Ignite is Windows Server 2019. The direction for Microsoft is clearly to the cloud, and the company ceaselessly encourages customers to move workloads to the Azure public cloud or to purpose-built cloud applications, such as Office 365 or Dynamics 365. Yet, Microsoft has carved a competitive niche for itself in the cloud out of its hybrid capabilities, and promises to continue to make and support software that runs on-premises for a while longer.
Strictly speaking, what Microsoft is announcing for Windows Server 2019 is an October GA, putting actual availability at least a week off. In technical terms, Windows Server 2019 is a Long-Term Servicing Channel release, meaning it is compiled to include features from the previous Semi-Annual Channel releases, versions 1709 and 1803, and that it will get five years of mainstream support and five years of extended support.
A simultaneous Semi-Annual Channel release of Windows Server, version 1809, will also be made available. Microsoft is emphasizing a focus in that version on containers and micro-services.
Internet of Things
One of Microsoft's most significant IoT initiatives also hit GA during Ignite. Azure IoT Central is intended to democratize IoT by making it accessible to users and organizations that don't want to set up the back-end infrastructure or perform the integration necessary to support and leverage an army of sensors and other devices in the field.
The solution differs from other Azure IoT solutions in that it is a Software as a Service (SaaS) offering rather than a Platform as a Service (PaaS) offering. Companies subscribe to Azure IoT Central on a per-user, per-month model. The service gives each device a unique security key, provides device libraries, supports common connectivity protocols, scales to millions of connected devices or millions of events per second, and offers time-series storage. The technology had been in public preview since December.
A more tactical GA release this week in the IoT arena is new capabilities in the Azure IoT Hub Device Provisioning Service. Overall, that service allows customers to provision, register and scale IoT devices. The new capabilities will deliver more control to customers over their IoT solutions through the ability to reprovision devices from one IoT solution to another and through enrollment-level allocation rules.
Major Azure Services
Three major services that change the product mix for the Azure public cloud reached GA at Ignite.
Microsoft is throwing its hat in the firewall-as-a-service ring with Azure Firewall. Hitting GA after a three-month public preview phase, Azure Firewall brings native security controls to the Microsoft public cloud. Microsoft describes it as "a managed, cloud native network security service to protect application resources with built-in high availability and unrestricted cloud scalability." The security control includes central administration and logging across subscriptions and virtual networks.
Another major Azure service reaching GA after a short public preview is Azure Virtual WAN. The cloud networking service is designed to allow organizations to provide branch-to-branch connectivity through Azure in an optimized and automated way. Customers can configure branch devices manually to connect to Azure Virtual WAN, or they can work with partners who automate the process. Microsoft lists a number of preferred partners already, including Barracuda Networks, Check Point, Citrix, NetFoundry, Palo Alto Networks, Riverbed Technology and 128 Technology.
Monitoring Azure services is an evolving task with an ever-broadening scope, and Microsoft made a number of enhancements GA to Azure Monitor. With the changes, Azure Monitor becomes the central location for monitoring infrastructure, apps and networks on Azure. The main change is integrating Azure Log Analytics and Azure Application Insights into Azure Monitor as features, rather than as separate services. While maintaining the full functionality of those deep application monitoring and deep infrastructure monitoring tools, Microsoft is making them available from within the Monitor interface.
Other Azure Enhancements
Eight other significant Azure services and features also hit the GA stage at Ignite.
- A new Speech Service takes improved versions of several of Microsoft's AI speech capabilities and combines them into a single service. Included are speech recognition, speech translation capabilities and customization capabilities to create a unique voice.
- A version 4 SDK for the Bot Framework, with ease-of-use and pick-and-choose enhancements to make it faster and simpler for first-time bot creators.
- The Azure Cosmos DB has three significant updates that are GA -- multi-master support for high availability and lower latency, the Cassandra API that makes Cosmos DB multi-model and multi-API, and the Reserved Capacity subscription option.
- The size limit for Azure Files shares is being expanded dramatically from 5TB to 100TB, a move Microsoft positions as enabling more flexible migrations of on-premises data files to the cloud.
- Azure Maps are updated with an improved Map Control API, which adds enhancements related to data layering, visualization, HTML-based icons and a new spatial math library.
- The Azure Standard SSD Managed Disks offering gives customers running Web servers and lightly used servers better solid-state drive (SSD) performance than they would get from Azure's hard disk drive (HDD) offerings.
- The Azure Serial Console is a tool to help developers and system administrators conduct self-service diagnosis and troubleshooting on virtual machines even when the VM is unreachable.
- The Azure SignalR Service builds on the SignalR ASP.NET library for adding real-time functionality, like chat or stock tickers, to Web applications by providing a back-end service that handles tasks like capacity provisioning, scaling and persistent connections.
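The pattern the Azure SignalR Service manages at scale is essentially publish/subscribe fan-out. A toy in-process sketch (hypothetical names, no network or SignalR API involved) of what the hosted service handles across many persistent connections:

```python
# Toy publish/subscribe fan-out. In a real deployment, the Azure SignalR
# Service owns the persistent connections, provisioning and scaling; here
# subscribers are just in-process callbacks.
subscribers = []

def subscribe(callback):
    subscribers.append(callback)

def broadcast(message):
    # Fan the message out to every connected subscriber, in order.
    for callback in subscribers:
        callback(message)

received = []
subscribe(lambda m: received.append(("chat-window", m)))
subscribe(lambda m: received.append(("ticker-widget", m)))

broadcast("msft +1.2%")
print(received)  # both subscribers saw the same broadcast
```

The value of the hosted service is that this fan-out keeps working when "subscribers" are thousands of browsers behind flaky connections rather than two local callbacks.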
Posted by Scott Bekker on 09/24/2018 at 2:39 PM
A challenge with artificial intelligence (AI) and mixed reality is often figuring out how to apply the somewhat futuristic concepts to everyday business problems. Microsoft is trying to move that process along within the context of its business applications suite, Dynamics 365.
On Tuesday, Microsoft showcased five new apps for Dynamics 365 that embed within everyday business tasks either the AI capabilities that increasingly permeate the Azure cloud or the mixed reality promise of the Microsoft HoloLens visor-based computer or Windows Mixed Reality immersive headsets from OEMs.
The demonstrations came as a teaser for the Microsoft Ignite and Microsoft Envision conferences next week in Orlando, Fla., where Microsoft will address IT and business applications audiences and unveil new products, features and services across its many platforms.
In a briefing for press and analysts, Alysa Taylor, corporate vice president of Business Applications & Industry Marketing at Microsoft, unveiled three AI modules that will be released as previews this fall: Dynamics 365 AI for Sales, Dynamics 365 AI for Customer Service and Dynamics 365 AI for Market Insights.
Taylor described the three products as "a new class of AI applications that will deliver out-of-the-box insights by unifying data and infusing it with advanced intelligence to guide decisions and empower organizations to take informed actions."
Dynamics 365 AI for Sales will be designed for both salespeople and their managers, providing next-step suggestions for the sales team and coaching recommendations based on pipeline analysis for their managers.
The Customer Service module will use natural language recognition and AI to both guide customer service employees and to provide virtual agents that can handle basic tasks and lower support costs. Again, the emphasis, Taylor said, is to deliver those benefits "without needing in-house AI experts and without writing any code."
The Market Insights module for Dynamics 365 is aimed at surfacing Web and social insights to improve the performance of marketing, social media and market research teams.
Also on Tuesday, Microsoft officials showed off two mixed reality modules for Dynamics 365 that have been previously discussed as part of the October 2018 release of Dynamics 365, which will be generally available on Oct. 1.
Microsoft Dynamics 365 Remote Assist (pictured above) provides for a new type of customer support that takes advantage of the hands-free nature of a HoloLens headset. The person needing assistance can wear the HoloLens, which both streams video to a remote support worker and allows the support worker to project spatial directions or diagrams onto the person's display. In technical terms, the solution consists of heads-up, hands-free video calling, image sharing and mixed reality annotations.
Microsoft Dynamics 365 Layout immerses planners and their customers in a 3-D version of potential room designs, floor plans or entire building configurations that help them collaborate on the finished setup. Leveraging various pieces of the Microsoft stack, including HoloLens or immersive goggles, the Layout module includes capabilities for scanning real spaces, loading new virtual layouts to overlay the physical version and the ability to view the results in mixed reality or via streaming to other screens.
Posted by Scott Bekker on 09/18/2018 at 2:13 PM
A new Microsoft service aims to fix the compatibility issues that tend to come up for organizations when they migrate applications to Office 365 and Windows 10.
Dubbed Desktop App Assure, the tool will be delivered as a component of FastTrack, Microsoft's internal migration desk for moving customers to its cloud platforms.
Microsoft announced Desktop App Assure on Wednesday in a blog post, with more details to come at the Microsoft Ignite show later this month. A North American preview of Desktop App Assure will start on Oct. 1, and worldwide availability is set for Feb. 1, 2019.
"Desktop App Assure operationalizes our Windows 10 and Office 365 ProPlus compatibility promise: We've got your back on app compatibility and are committed to removing it entirely as a blocker," wrote Jared Spataro, corporate vice president for Office and Windows Marketing, in the post.
The service is designed to overcome concerns about app compatibility, which Spataro characterizes as disproportionate to the statistics Microsoft sees in customer diagnostic data. His post contends that 99 percent of apps are compatible with new Windows updates, and that apps that work on Windows 7 will generally continue to work on Windows 10 and subsequent feature updates.
"But if you find any app compatibility issues after a Windows 10 or Office 365 ProPlus update, Desktop App Assure is designed to help you get a fix," he said.
The program works on a service desk model. Customers who experience a problem file a ticket through FastTrack and receive follow-up from a Microsoft engineer.
Desktop App Assure will be included for customers of Windows 10 Enterprise and Windows 10 Education.
Posted by Scott Bekker on 09/06/2018 at 3:07 PM
If you're waiting longer to refresh the PCs in your organization, you're not alone.
According to new forecasts released this week by IDC, lengthy refresh cycles are contributing to a gloomy outlook for personal computing devices.
At this point in the year, IDC is predicting that total unit shipments of traditional PCs, tablets and workstations will decline by 3.9 percent in 2018. The compound annual growth rate (CAGR), if growth is the right term, is -1.5 percent for the next five years.
Shipments for this year are expected to hit 407 million devices. By 2022, IDC is lowering that figure to 383 million devices.
Among the sub-sectors, commercial-focused PCs are a critical laggard. "Desktop PCs are expected to see a CAGR of -2.7% as most of these devices are destined for the commercial market where lengthy refresh cycles and saturation are contributing to a steady decline in shipments," the Framingham, Mass.-based market research firm said in a statement.
Some other sectors are predicted to do even worse over the next five years. Slate tablets have a five-year CAGR of -5.3 percent. Traditional notebooks and mobile workstations are expected to bomb even more, with a five-year CAGR of -9.1 percent.
Those organizations that are refreshing their PC inventory over the next five years are expected to put more of their money into two distinct categories.
"While the ramp of convertibles and detachables has been more crawl than run, the category on the whole continues to build momentum," IDC researcher Linn Huang said in a statement. The ultraslim notebook category is expected to have a CAGR of 7.8 percent, while the 2-in-1 device category has a projected CAGR of 9.3 percent, according to IDC.
Posted by Scott Bekker on 08/31/2018 at 11:24 AM
Enterprise Windows administrators worldwide can relate to the cold-sweat-down-the-back moment detailed in Wired's new chronicle of the NotPetya attack last summer.
"The Untold Story of NotPetya, the Most Devastating Cyberattack in History," by Andy Greenberg, focuses on the apparently collateral damage to the world's largest shipping conglomerate, A.P. Møller-Maersk, when NotPetya hit last summer.
Posing as a piece of ransomware, NotPetya was actually spreading extremely quickly and encrypting systems' master boot records, rendering them unusable and unrecoverable. Conventional wisdom is that Russia designed the malware to attack Ukraine, but NotPetya brought Maersk's global operations to a halt and cost the giant $250 million to $300 million or more.
Deep in the piece, Greenberg reports on Maersk's NotPetya-related trouble with domain controllers:
Early in the operation, the IT staffers rebuilding Maersk's network came to a sickening realization. They had located backups of almost all of Maersk's individual servers, dating from between three and seven days prior to NotPetya's onset. But no one could find a backup for one crucial layer of the company's network: its domain controllers, the servers that function as a detailed map of Maersk's network and set the basic rules that determine which users are allowed access to which systems.
Maersk's 150 or so domain controllers were programmed to sync their data with one another, so that, in theory, any of them could function as a backup for all the others. But that decentralized backup strategy hadn't accounted for one scenario: where every domain controller is wiped simultaneously. "If we can't recover our domain controllers," a Maersk IT staffer remembers thinking, "we can't recover anything."
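The failure mode is easy to see in a toy model (illustrative only; the names and topology below are invented, not Maersk's actual setup): replication makes every copy identical, so an event that wipes all online replicas at once destroys the data everywhere, and only a replica that happened to be offline survives.

```python
# Toy model: 150 replicated domain controllers, all kept in sync, plus one
# knocked offline by a blackout before the attack (invented names).
controllers = {f"dc-{i:03d}": {"data": "directory", "online": True}
               for i in range(150)}
controllers["dc-ghana"] = {"data": "directory", "online": False}  # blackout

# A NotPetya-style wipe hits every reachable machine simultaneously.
for dc in controllers.values():
    if dc["online"]:
        dc["data"] = None

survivors = [name for name, dc in controllers.items() if dc["data"]]
print(survivors)  # ['dc-ghana']
```

Replication protects against the loss of *some* replicas; only a copy isolated in time or network terms (an offline machine, or a conventional backup) protects against losing all of them at once.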
Salvation came in the form of a power outage. Frantic calls went out from the recovery operations center near London to hundreds of IT admins in datacenters worldwide.
Maersk's desperate administrators finally found one lone surviving domain controller in a remote office -- in Ghana. At some point before NotPetya struck, a blackout had knocked the Ghanaian machine offline, and the computer remained disconnected from the network. It thus contained the singular known copy of the company's domain controller data left untouched by the malware -- all thanks to a power outage.
Sometimes what seems like bad luck -- say, a power outage knocking down your domain controller -- turns out to be the luckiest thing in the world.
Posted by Scott Bekker on 08/27/2018 at 10:03 AM
The long-promised public play date between Alexa and Cortana is here.
The digital voice assistants' corporate parents, Amazon and Microsoft, first unveiled that they were working on getting Alexa and Cortana together a year ago.
It took a little longer than the companies originally anticipated. A planned end-of-2017 release came and went without comment, but Amazon Alexa and Microsoft Cortana did appear together onstage for a successful demo at the Microsoft Build conference in May.
Officially, the integration that was announced on Wednesday is a "public preview," although what that term means anymore in this age of cloud services telemetry and constant feature upgrades is up for debate. Essentially, the integration is here.
Practically, the public preview has some limitations that will be lifted over time, alongside the usual addition of new services and skills. It's only available to U.S. customers to start, and doesn't support music streaming, audio books, flash briefings or setting alarms. The public preview will also query users about their experience with the integration.
What U.S. customers will be able to do during the public preview is call up Cortana from Echo devices or call up Alexa from Windows 10 PCs and Harman Kardon Invoke speakers. Each digital voice assistant is enabled as a skill on the other's platform.
To start off on an Echo device, users need to say, "Alexa, open Cortana." That command will bring up instructions and a requirement to sign into the Microsoft account that includes Cortana. Conversely, from a Windows 10 PC, a user will say, "Hey, Cortana, open Alexa," and follow the prompts on screen to sign into Alexa.
According to Amazon's announcement blog post, from Cortana within Alexa, users can say things like:
- "What new e-mails do I have?"
- "What's on my calendar for tomorrow?"
- "Add 'order flowers' to my to-do list."
While going from Alexa within Cortana, a user can say:
- "Turn on the lights."
- "Play Jeopardy."
- "What's my order status?"
- "Add milk to my shopping list."
Posted by Scott Bekker on 08/15/2018 at 9:59 AM
Microsoft is consolidating a number of ports for Azure Stack in a move that should significantly reduce the hybrid cloud platform's attack surface and simplify network integration.
Starting with a forthcoming release, Microsoft will collapse port requirements for various Azure services running on Azure Stack from 27 different ports to just one. The services will communicate via Port 443, the standard port for HTTP over TLS/SSL.
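Port 443 is the IANA-registered HTTPS port, which is what makes the consolidation attractive: virtually every firewall already permits it. A quick standard-library check (Python, reading the local services database):

```python
import socket

# 443 is the registered service port for "https" (HTTP over TLS/SSL),
# so traffic funneled through it traverses most firewalls unmodified.
print(socket.getservbyname("https", "tcp"))  # 443
```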
Microsoft positions Azure Stack as a key differentiator versus other major public cloud providers, in that customers can run an integrated hardware and software system that is supposed to offer the exact same platform as Microsoft's Azure public cloud, but in a private datacenter. The approach enables customers to use the same application code in the public cloud and on the private cloud.
Early demand for the technology includes edge environments, disconnected environments, customers with specialized security requirements and those with specific compliance concerns. Hardware partners currently offering the 4-12 node integrated systems include Cisco, Dell EMC, Hewlett Packard Enterprise, Huawei and Lenovo.
Because it runs the same underlying code as Azure in the public cloud, Azure Stack supports a number of Azure services. Up until now, Microsoft has added the functionality for each service to its Azure Stack portal via a portal extension using a separate network port.
In a blog post on Friday announcing the change, Thomas Roettinger, senior program manager for Azure Stack, acknowledged customer pushback for managing and securing multiple ports. "As the number of Azure services increases, so do the number of ports that must be opened on a firewall that supports Azure Stack," Roettinger said.
Following in Azure's footsteps, Azure Stack will soon adopt so-called Extension Host technology to funnel all the ports through Port 443. "In its first release, the User and Admin portal default extensions have moved to this model, thereby reducing the number of ports from 27 to one. Over time, additional services such as the SQL and MySQL providers will also be changed to use the Extension Host model," Roettinger said.
The change will be fully implemented with the 1810 update of Azure Stack. In preparation, Azure Stack customers will need to import a pair of wildcard SSL certificates, one for the admin portal and one for the tenant portal.
The current build, 1807, was only released a few days ago, and Roettinger suggested users have some time to prepare. New deployments of Azure Stack will require the wildcard certificates sometime in September, he said.
Posted by Scott Bekker on 08/13/2018 at 12:15 PM
As Microsoft steadily shifts its focus to the cloud and ramps up release cadences for all of its code, the future of Windows is a pressing question.
At the TechMentor conference this week in Redmond, Wash., a panel of expert speakers gazed into their own crystal balls to assess the future of the operating system, both for end users and in on-premises servers. Their lively discussion touched on controversies surrounding the Windows 10 update cycle, looked ahead to features coming in Windows Server 2019 and included the panelists' best bets for IT pros looking to keep their skills current.
TechMentor co-chairs Sami Laiho and Dave Kawula moderated the panel on Wednesday that featured session presenters Peter De Tender, Petri Paavola and Orin Thomas, all of whom are current or recent Microsoft Most Valuable Professionals (MVPs). The TechMentor conferences are a production of Redmondmag.com's parent company, 1105 Media Inc.
Laiho steered the discussion into the recent controversies over the Windows 10 update and patch process, where simmering discontent among administrators responsible for patching systems came to a head with the problematic July security updates.
Several panelists agreed that Microsoft's fast cadence of security fixes and feature updates was great in theory, but that IT pros justifiably lack confidence in the updates due to recent events.
"The goal has to always be that the update needs to be fast. It should be just, 'OK, now I have the big update, this may take a few minutes more time, but everything is working.' In real life, we have been seeing many problems with the new versions," Paavola said. "Hopefully in the next one or two years, we won't need to have these conversations."
Thomas argued that in addition to improving the patch release quality over time, Microsoft needs to improve its messaging for enterprises about the importance of features in the twice-a-year Windows 10 releases.
"If the messaging was getting there as to why these new features are important, that would be good. The reality is, most organizations aren't necessarily using all of the existing features, so giving them new features isn't necessarily a win," Thomas said. "I think on the enterprise side, we've still got to get people using things like Credential Guard and Device Guard and Windows Defender ATP."
Thomas also believes Microsoft could reduce resistance by making the current update process more flexible. "One of the other challenges around updates has been also the feeling of lack of control over when the updates install," he said. While Thomas says Microsoft is justified in its concern for keeping users updated, especially on security fixes, the current attitude is grating: "You're getting this update and we're going to install it when we want, and you've got no choice in the matter."
Swinging the pendulum back a little bit so users have a notification icon in their taskbar, where they can slightly delay an update, might go a long way, he said.
Earlier, moderator Kawula queried Thomas about what elements of the on-premises Windows Server 2019 release, scheduled for later this year, he was most excited about.
Thomas highlighted improvements to the Guarded Fabric/Shielded Virtual Machine features and container features that first appeared in Windows Server 2016.
"Not only are you going to be able to run shielded Windows VMs and encrypted Windows VMs, you're going to be able to better run shielded VMs running Linux and encrypted VMs running Linux," he said. "We're going to see much more of this story about Windows Server being the fabric on which you can run your heterogeneous workloads. Windows Server 2019 is very much going to be a great environment if you want to run on-prem Linux virtual machines. Not only that, you can install Windows Subsystem for Linux on Windows Server 2019, and you're going to be able to run Linux containers side-by-side with Windows containers."
Those types of improvements represent an important way to think about future on-premises releases of Windows, Thomas suggested. While he predicted Microsoft will continue to release robust Windows Server products for years, expect them to be evolutionary. The revolutionary new features will probably be reserved for Azure.
Discussing skills that IT professionals need to hone to keep current, Laiho emphasized that the critical technical abilities for forward-looking administrators are PowerShell, PKI and IPv6. "And you have to accept the cloud," he said.
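For the IPv6 item on that list, the mechanics carry over from IPv4 subnetting, just with 128-bit addresses written in hex. A small sketch using Python's standard `ipaddress` module (the `2001:db8::/32` prefix is reserved for documentation examples):

```python
import ipaddress

# IPv6 subnetting works like IPv4, with 128-bit addresses in hex notation.
net = ipaddress.ip_network("2001:db8::/32")   # documentation prefix
addr = ipaddress.ip_address("2001:db8::1")

print(addr in net)             # True: the address falls inside the prefix
print(addr.compressed)         # 2001:db8::1 (longest zero run collapsed)
print(len(list(net.subnets(new_prefix=48))))  # 65536 /48s fit in a /32
```

The sheer size of the subnetting math (65,536 /48 sites inside one /32 allocation) is a big part of why Laiho flags IPv6 as a skill worth learning before it's urgent.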
De Tender agreed that IT jobs will remain relevant, so long as those who hold the jobs tweak their on-prem skills for the cloud. "You still need architecture design. If you're the network expert today, if you deploy workloads in Azure -- no matter what virtual machines, PaaS, IaaS -- you still need networking concepts. If you're the SQL DBA or any flavor DBA...if you move your database to the cloud, you still need to have your database knowledge," he said. "It's not that your job will somehow all of a sudden stop, but you need to be flexible, you need to adapt."
Posted by Scott Bekker on 08/09/2018 at 3:46 PM
If you've turned away from OneDrive or stopped paying attention to Microsoft's file hosting service, it may be time to take another look.
Stephen Rose, senior product manager for OneDrive for Business, provided an update on the service's roadmap during the main keynote at the TechMentor conference (pictured above), being held this week at Microsoft's campus in Redmond, Wash. The TechMentor conference is hosted by Redmondmag.com's parent company, 1105 Media Inc.
Rose, a longtime veteran of Microsoft who has also held roles on the Windows client team, made the case for how central OneDrive has become within the Microsoft productivity and collaboration stack.
"We look at [OneDrive] as the place to share and work with all of your files. It is the backbone for SharePoint, for Office, for Teams, for Stream, for all of those apps as the way to integrate and move," Rose said Tuesday.
Meanwhile, he said that organizations that went with Box or Dropbox over the last few years might be surprised at the pace of improvement in OneDrive, which has come a long way from a time when it was still based on the Groove.exe client, lacked external sharing and didn't support previewers. He highlighted recent customer successes at Rackspace and MGM Resorts International, which are both leveraging Microsoft product stack synergies to save money or upgrade capabilities.
Rose spent most of his talk going point by point through significant updates that are in process for OneDrive. He briefly covered about a dozen enhancements since last year, including Secure Internal Sharing, Mac Office Sync Integration, Hover Card, Files Restore, Hold Your Own Key, Copy/Move from OneDrive to SharePoint, GDPR compliance, and support for more than 320 file previewers. Then he moved on to detail more than a dozen new features of OneDrive that are currently rolling out or will arrive in the next few weeks. "There's nothing I'm talking about here that's going to land later than September," he promised.
Among new features that Rose listed as either generally available or in the process of rolling out to current users were Known Folder Move, Camera Upload, Customization of Sharing Emails and Transfer Ownership.
Rose spent the most time on Known Folder Move, or KFM, which is a major component in Microsoft's strategy for helping IT back up and restore or migrate end users' desktops.
"With the [Known Folder Move] update, what you'll see is a new tab called Auto Save," Rose said. If IT enables it, end users can go in and start the process themselves, or IT departments can use a Group Policy Object (GPO) to push out a request that end users protect their folders. Either way, KFM creates the Desktop, Pictures and Documents folders in the cloud and synchronizes the users' content in both places. Users can continue to work while the content synchronizes.
Microsoft has already pushed KFM to all managed devices for users with less than 10GB of total files, which Rose said was about 30 percent of users. "We're going to open it up to everybody by the end of September," he said. One wrinkle is Microsoft enthusiasts with both business and consumer OneDrive accounts can't double up their synchronization. "If IT has said you do this for OneDrive for Business, you're not going to be able to do it for Consumer because you don't have two separate desktops and two separate photos, and it would create a lot of confusion," he explained.
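For IT departments that want to script the rollout rather than rely on the wizard, the KFM Group Policy settings ultimately land as registry values under the OneDrive policy key. The fragment below is an illustrative sketch only: the value names match the OneDrive GPO settings as documented at the time, but verify them against Microsoft's current OneDrive Group Policy documentation before deploying, and replace the placeholder GUID with your own Azure AD tenant ID.

```
Windows Registry Editor Version 5.00

; "Prompt users to move Windows known folders to OneDrive" (the KFM wizard).
; For a silent migration with no user prompt, the documented value is
; "KFMSilentOptIn" with the same tenant ID string.
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\OneDrive]
"KFMOptInWithWizard"="00000000-0000-0000-0000-000000000000"
```

Deploying this through the official OneDrive ADMX templates is preferable to raw registry edits, since the ADMX route keeps the setting visible in Group Policy reporting.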
The other new features that Rose discussed that were in various phases of general availability or some sort of phased rollout included the following:
- Camera Upload allows a user to pick a business account for a camera upload from a smartphone. In addition to more storage room and Office integration, the feature allows companies to store and own pictures that are company property, rather than relying on users to manage the images in their personal device photo collection. Microsoft has also added support for SD cards in Android.
- Customization of Sharing Emails is a benefit for customers with E1, E3 and E5 plans. They can display their tenant logo when sharing through OneDrive.
- Transfer Ownership is a feature for when an employee leaves the company or their current role. Rather than having the employee's OneDrive files go to their supervisor, organizations can now choose delegates. The feature has also been introduced for SharePoint and Exchange.
Among the features Rose highlighted that aren't publicly available yet but are coming in the next two months, several significantly advance OneDrive capabilities.
Users will get more control over securely sharing files externally. "We've had some requests because the external sharing with the password protection is great, but we've had some folks saying I want to be able to set my own password when I'm sharing something externally," Rose said. A feature called Password Protected Links will allow an end user to pick a password and share it via instant message, text or by phone. Another feature called Block Download, which can be turned on by an administrator, lets the user prevent the person they're sharing the file with from downloading or printing the file.
Administrators and managers will be able to audit the external sharing trail, as well, with a feature called External Sharing Reports. Running the report shows everything that's been shared externally, who shared it, what the rights were and what the access was.
The OneDrive team is working with the Microsoft Intune team on a set of Intune policies, so that administrators can conduct administration on OneDrive through that management tool rather than through GPOs, if they choose.
Additional work is being done to make the sharing user experience through OneDrive Mobile more closely match the experience on the desktop, the Web and the Mac.
A new scan experience leveraging the Microsoft Office Lens tool is being added to improve scanning, and it is being integrated with Flow for processes such as connecting receipts to expense approvals.
Posted by Scott Bekker on 08/08/2018 at 3:45 PM
The smallest of Microsoft's "Surface" family of 2-in-1 devices is now available in the United States and Canada, three weeks after it was first announced.
The Surface Go is being sold in Microsoft Stores, Best Buy and through reseller partners. It sports a 10-inch screen, weighs 1.15 pounds, is a third of an inch thick and runs a 7th Generation Intel Pentium Gold Processor 4415Y. The unit ships with Windows 10 S and a 30-day trial of Office 365 Home, and its ports and jacks support USB-C, Surface Connect, Surface Type Cover, headphones and a microSDXC card.
The entry-level model comes with a 64GB eMMC drive and 4GB of RAM for $399. The higher-end version has a 128GB solid-state drive and 8GB of RAM for $549. Much of the signature functionality of Surface devices requires additional purchases, such as the Surface Go Signature Type Cover for $99 and the Surface Pen for $99.99. A new Surface Mobile Mouse costs $34.99.
The two versions showed up in stock at Microsoft Stores around the United States and Canada, but the release represents the first phase of a multi-stage release plan. In a blog, Yusuf Mehdi, corporate vice president of the Windows and Devices Group, said Surface Go would be available in other countries later this month.
Much of Mehdi's post focused on how people are more comfortable knowing that a laptop is nearby when they're on vacation, and that satisfaction with straightforward tablets is on the decline.
When the pricing is taken into account, Microsoft is essentially selling a tablet for as little as $400 with Surface Go, while the practical base configuration, tablet plus keyboard/cover, comes to a real price of about $500. Full configurations with keyboard/cover, pen and mouse will run either $635 for the 64GB model or $785 for the 128GB model.
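Summing the component prices quoted above (a quick illustrative calculation, working in integer cents to avoid floating-point drift) shows how the bundle totals come together; the post's rounded $635 and $785 figures run a couple of dollars above the raw sums:

```python
# Surface Go bundle totals from the prices quoted in this post,
# in integer cents to avoid floating-point rounding errors.
TABLET_64GB = 399_00
TABLET_128GB = 549_00
TYPE_COVER = 99_00
PEN = 99_99
MOUSE = 34_99

def dollars(cents: int) -> str:
    """Format integer cents as a dollar string."""
    return f"${cents / 100:,.2f}"

# The tablet alone starts at $399, but the practical base is tablet + cover.
base_64 = TABLET_64GB + TYPE_COVER                  # $498.00, roughly $500
full_64 = base_64 + PEN + MOUSE                     # $632.98
full_128 = TABLET_128GB + TYPE_COVER + PEN + MOUSE  # $782.98

print(dollars(base_64), dollars(full_64), dollars(full_128))
```

The raw sums come to $632.98 and $782.98, so the $635 and $785 figures appear to reflect rounding up to convenient price points.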
The flagship version that CEO Satya Nadella called attention to last month at the Microsoft Inspire conference won't be available until later this year. That version will include LTE connectivity, in addition to the Wi-Fi and Bluetooth built in to the other models. Pricing has not been disclosed for the LTE Surface Go. Discussing his home and work productivity setup at Inspire, Nadella said he had an early-access version of the LTE model.
Posted by Scott Bekker on 08/02/2018 at 3:14 PM
Customers who don't want to follow Microsoft into the cloud for their Office productivity products will be paying more for that on-premises option starting next quarter.
Microsoft announced several licensing changes this week that will go into effect on Oct. 1.
The clearest change is a 10 percent hike in Office 2019 commercial prices, including the Office client, Enterprise CAL, Core CAL and server products. Office 2019 is expected to ship later this year. Microsoft also released preview versions of several Office 2019 servers earlier this week.
The pricing change is one of several ways that Microsoft has been constraining the on-premises version of Office as it tries to steer customers to the subscription- and cloud-based Office 365. Earlier this year, Microsoft announced a shorter support lifecycle for Office 2019: five years of mainstream support and two years of extended support. The regular Microsoft lifecycle calls for five years of extended support, for 10 years of total support rather than the seven being offered for Office 2019.
Additionally, Microsoft has said Office 2019 will only be supported on Windows 10, not Windows 7. That limitation occurs even though the Windows 7 lifecycle doesn't end until January 2020, a full year after the Office 2019 release.
There is also a price increase in Windows 10. Microsoft announced it is renaming the Windows 10 Enterprise E3 offers and raising the price of the per-device version to match the per-user version. The E3 name will now refer only to the per-user offer. That means Windows 10 Enterprise E3 per User becomes Windows 10 Enterprise E3, and Windows 10 Enterprise E3 per Device becomes Windows 10 Enterprise.
Microsoft unveiled several other broad changes to its volume licensing programs:
- Establishing a single, consistent starting price across all programs aligned to web direct for online services (OLS)
- Removing the programmatic volume discounts (Level A and Open Level C) in Enterprise Agreement (EA)/EA Subscription, MPSA, Select/ Select Plus, and Open programs (Open, Open Value, Open Value Subscription)
- Aligning government pricing for on-premises and online services to the lowest commercial price in EA/EAS, MPSA, Select Plus, and Open Programs
- Delivering a newly designed Customer Price Sheet that better outlines how a customer's price was derived (direct EA/EAS only)
With its announcement post, Microsoft positioned the changes as part of a "Modern Commerce" strategy. "These changes will highlight the benefits of our pricing for a cloud-first world, help us move from program-centric to a customer-centric pricing structure, and create more consistency and transparency across our purchasing channels," read the post, which was attributed to the MPN (Microsoft Partner Network) team.
In a series of tweets, Directions on Microsoft analyst Wes Miller called the Office licensing adjustment an important change to note.
"If you add all of these motions up, and look at other lightly announced price increases, it clearly points toward encouraging customers that have avoided licensing Office 365 (or now Microsoft 365) to look again," Miller wrote under his Twitter handle @getwired.
Additionally, Miller, who follows licensing closely at Directions, called the newly released Office price changes, along with the shorter support lifecycle, the first two shoes to drop, with two other changes likely later.
"Possible shoe 3: Highly likely that Office 2019 Professional Plus won't feature roaming rights, and that it will drop them over the next 3 years, as Windows 10 did (a process that completes in early 2019). Meaning if you have server-based desktops of any kind, it's ProPlus time," Miller wrote. "Possible shoe 4: Will there even _be_ an Office 2019 Standard? Less and less differentiating value between it and Professional Plus - we've long thought it might be dropped."
Posted by Scott Bekker on 07/27/2018 at 12:29 PM
The countdown clock is starting on a standard desktop configuration.
Much like it beat a drum around the end-of-life deadline for Windows XP, Microsoft is now starting to warn customers and partners that a popular desktop configuration's time is limited.
The new focus of the upgrade talk is Windows 7 and Office 2010. The last big cycle was around the extremely popular Windows XP desktop, which went out of extended support in April 2014. End of support for XP's follow-on, Windows Vista, arrived three years later, but was not a big deal given Vista's comparatively low adoption.
Ron Markezich, corporate vice president for Microsoft 365, started the Windows 7/Office 2010 conversation during a keynote last week at Microsoft Inspire. "This move over the next three years represents a $100 billion opportunity for all of our partners," Markezich said at Microsoft's biggest annual gathering for partners from around the world.
The deadlines for extended support on the two products are Jan. 14, 2020 for Windows 7 and Oct. 13, 2020 for Office 2010.
It makes sense for Markezich to be the one to start the drumbeat this time around. When XP was hitting end-of-life, the biggest issue was security and compatibility. While XP was incredibly stable for a Windows release and offered solid performance on many applications until late in its lifecycle, Microsoft was having trouble keeping the aging code up to date with evolving security threats, and compatibility with newer devices and applications was becoming problematic.
For this cycle, the move from Windows 7 represents a shift -- should users stick with Microsoft -- to the constantly upgraded Windows 10 on the OS side. On the Office side, the push will be to get users from the desktop Office paradigm into an Office 365 subscription and the cloud.
Both are under Markezich's purview. He runs the Microsoft 365 business, which is the core of Microsoft's Modern Desktop initiative and includes Windows 10, Office 365 and Enterprise Mobility + Security (EMS).
Expect a hard sell from Microsoft from now through 2020 to get those desktops moved to Windows 10 and Office 365.
Posted by Scott Bekker on 07/25/2018 at 11:37 AM
Microsoft has wrapped up its Inspire conference this month in Las Vegas, where it combined the annual partner event with its annual internal sales event, Microsoft Ready. Here are top quotes from Inspire keynotes that hit the key themes of the show, which marks the start of Microsoft's fiscal year.
Intelligent Cloud/Intelligent Edge
"The opportunities for us to serve our customers in this new era of the intelligent cloud and the intelligent edge is far greater. I've sort of lived through the client-server, the Web, mobile, cloud, but what we're going to see going forward is going to be even more profound." --Satya Nadella, CEO, Microsoft
During his keynote last Wednesday, Nadella walked attendees through a number of customer scenarios that hit on the intelligent cloud or the intelligent edge. The messaging covers the Azure platform in all its iterations from the public cloud to Azure Stack to Azure Sphere, and includes the artificial intelligence (AI) that is built into many Azure services.
Focus on Privacy
"Privacy is a fundamental human right. ... Our fundamental value proposition for our customers, not just as a company, but as a community, that we will help them better protect the privacy of their customers." --Brad Smith, President and Chief Legal Officer, Microsoft
In a keynote, Smith reiterated Microsoft's commitment to improving privacy and security for customers, while also trying to take a thought leadership role in the ethics of AI.
Modern Workplace Refresh
"This move over the next three years represents a $100 billion opportunity for all of our partners." --Ron Markezich, Corporate Vice President, Microsoft 365
Markezich runs the Microsoft 365 business, which is the core of Microsoft's Modern Desktop initiative and includes Windows 10, Office 365 and Enterprise Mobility + Security (EMS). Markezich was describing the opportunity to sell Microsoft 365 and associated devices due to the upcoming upgrade cycle around end of support for Windows 7 and Office 2010.
"I was very excited to welcome the Windows team into Azure, and we're one unified group now. So, everything from silicon at the base level, firmware, operating systems, everything in the middle of the stack up through app stack to tools, we've got everything on one team to go deliver on this vision." --Jason Zander, Executive Vice President, Microsoft Azure
During his keynote about Azure for partners, Zander made the case for why the Windows team was consolidated under his engineering unit.
"We've built over 28,000 solutions, services and applications with you this past fiscal year. We have generated over 3 million leads out to partners and jointly developed over 100,000 co-sell opportunities. And, get this, we have landed over $5 billion in partner sales. That's your sales, not Microsoft's sales, $5 billion. And the best thing about that is that that is with 87 percent of IP co-sell ready partners participating in that motion." --Gavriella Schuster, Corporate Vice President, One Commercial Partner, Microsoft
Since being announced a year ago, the Microsoft co-selling program with partners gained a lot of steam in fiscal year 2018. Schuster said Microsoft will continue to reward Microsoft sellers for selling partner solutions at 10 percent of partners' contract value at least through June 2019.
"The amount of interconnect cable we have across Azure datacenters is enough to go to the moon and back three times over. It's crazy." --Satya Nadella
The Inspire keynotes are always a time for metrics about the scale of Azure. Nadella talked about the cabling in the context of Microsoft having added 14 datacenter regions over the last fiscal year to bring the total to 54 worldwide. Elsewhere during the show, Microsoft shared that its private network includes over 4,500 peering locations and 130 edge sites.
"Today is a momentous milestone for us, bringing together these two communities, Inspire and Ready, because that's how our customers see us, as one. And to be able to kick off our fiscal year is something that I think is going to really mark a real difference in how we as this tech community are going to serve our customers going forward." --Satya Nadella
This statement during Nadella's Wednesday keynote captured the main reason for combining the two conferences for the first time.
"I always say this to any student who is joining Microsoft or looking to join Microsoft. I say to them, 'Look, if you want to be cool, go look for someplace else. But if you want to join a company that is committed to making others cool, join Microsoft.'" --Satya Nadella
After decades in Redmond, Nadella really put his finger on the pulse of what it means to work at Microsoft or be in the Microsoft ecosystem. The statement was a big applause line in Las Vegas.
Posted by Scott Bekker on 07/23/2018 at 12:54 PM
Microsoft's latest fiscal year earnings are in, and it's official: Microsoft is now a $100 billion company.
According to financial results released after markets closed on Thursday, the company earned revenues of $110.36 billion for the full year ended June 30, a 14% jump compared to the $96.57 billion in revenues the previous year.
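The 14% figure follows directly from the two annual totals; as a quick sanity check on the reported numbers:

```python
# Year-over-year growth implied by the full-year revenue figures
# reported above (in billions of dollars).
fy2018_revenue = 110.36
fy2017_revenue = 96.57

growth_pct = (fy2018_revenue - fy2017_revenue) / fy2017_revenue * 100
print(f"{growth_pct:.1f}%")  # about 14.3%, consistent with the reported 14% jump
```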
"We had an incredible year, surpassing $100 billion in revenue as a result of our teams' relentless focus on customer success and the trust customers are placing in Microsoft," said CEO Satya Nadella in a statement accompanying the financial results. Nadella called out the company's early investments in intelligent cloud and intelligent edge as paying off.
CFO Amy Hood added that sales execution and commercial cloud revenue growth were key to strong results in the fourth quarter.
For Q4, Microsoft reported a 17% revenue gain to $30.1 billion and earnings per share of $1.13. Both figures beat analyst expectations and the stock was up slightly in after-hours trading.
Earlier in the week at the Microsoft Inspire conference for partners, company officials modeled estimates of total Microsoft ecosystem value, including partner revenues, based on a Microsoft annual revenue base of $100 billion.
By major business units, the fourth quarter revenue was up 13% to $9.7 billion in Productivity and Business Processes, up 23% to $9.6 billion in Intelligent Cloud, and up 17% to $10.8 billion in More Personal Computing.
Among the more granular product and service highlights:
- Commercial cloud gains cooled slightly from the previous quarter. This business was up an impressive 53% year over year, but the figure was 58% for the third quarter. Commercial cloud revenues totaled $6.9 billion and include commercial versions of Office 365, Dynamics 365 and the Azure public cloud. Azure revenues on their own were up 89%, Dynamics 365 revenues were up 61% and Office 365 commercial revenues were up 38%.
- Windows OEM revenue was up 7%, driven by 14% growth on the OEM Pro side and offset by a 3% drop on the non-Pro side. Revenues for Surface shot up 25%, with Microsoft noting that new editions are faring well against a low prior-year comparable. For the full year, the Surface unit generated $4.6 billion in revenues.
- Enterprise services revenues also tell an interesting story in the fiscal year. Always a point of interest for channel partners who often compete with Microsoft consultants, the service business over the past fiscal year has seen steady growth. In fiscal year 2017, enterprise services posted a 2% decline in revenues compared to the year before. Yet for FY18, revenues were up 1% in Q1, 5% in Q2, 8% in Q3 and 8% again in Q4. For the full year, enterprise service revenues increased 5%.
Posted by Scott Bekker on 07/19/2018 at 3:26 PM
In his keynote at Microsoft Inspire on Wednesday, Microsoft CEO Satya Nadella detailed the Microsoft products and tools he uses to stay productive.
There used to be a regular saying that Microsoft's IT department was the company's first, best customer.
The idea was the department was always at the ready to dogfood technical previews and beta versions of Microsoft's enterprise software and services. Using the software to run a 100,000-person company with nearly $100 billion in revenues and global-scale operations is a great way to kick the tires and prove the scalability of new software.
When it comes to the company's flagship productivity software and services, there's a similar idea -- how does the CEO of Microsoft use Office, Windows and other tools to get more done every day? It was a source of fascination for customers and partners in Bill Gates' day and in Steve Ballmer's day. Now the company's third CEO is sharing his tips and tricks. On Wednesday, Nadella shared some details about how the company's highest-profile internal user personally leverages Microsoft 365 in his daily work.
As such demos are effectively an advertisement for Microsoft's latest-generation products, Nadella first sought to create some device envy among the thousands of Microsoft partners and employees in attendance at Inspire. "I have a Surface Studio at work and at home. In fact, Surface Go has really been a game changer for me. I have this early access Panos [Panay] gave me over LTE, it's just awesome," Nadella said. While pre-orders are currently being taken for the first models of the Surface Go, which is a smaller and lighter version of the Surface 2-in-1, they aren't shipping until Aug. 2. But for the LTE version that Nadella says he is using, a release timeframe hasn't been discussed yet.
Most of Nadella's demos involved an Android phone, an iPhone and a Surface as a computer. About both phones, Nadella joked that to him they were just "Microsoft 365 endpoints." To support that idea, he showed how the screens for both the Android device and the iPhone were filled with icons for Microsoft apps.
To the common question of whether customers should use Teams or Yammer, Nadella's workflow provided an interesting answer: both.
"What I want to start with is my communications diet," Nadella explained. "I use three things throughout the day. I use Outlook as my open loop. This is my ability to communicate with any one of you, or anyone inside the company. Microsoft Teams, that's my inner loop. That's how I stay in touch with the groups, as well as the projects that I'm closely working on and closely working with. Then Yammer, that's my [outer] loop. That is my ability to make sure I'm in touch with [what] the 100,000 [Microsoft employees] are really buzzing about."
While using the Android phone, Nadella gave a hard sell for Microsoft's Outlook app. "By the way, if there's one thing I will ask all of you to do, it's download Outlook, it will change your life. It's super helpful in your ability to stay productive," said Nadella, adding later, "Outlook is the best Gmail client. If you don't trust me, check it out."
He showed how he uses the Outlook app ability to triage e-mail with flags, relies on Focused Inbox heavily, and uses new "do not disturb" functionality for events like Inspire. "The other thing that we just added recently is do not disturb. Especially when you're at an event like this, and you're getting all these e-mail notifications that are trying to attract your attention, you can make sure that you're not distracted," he said. He also showed how he uses Outlook as a universal client for his Outlook.com e-mail, his Office 365 e-mail and his Gmail.
Aside from using Outlook, Teams and Yammer to monitor his three "loops" throughout the day, Nadella called out his own use of LinkedIn, Cortana, To Do, Bing, Edge and Stream.
He presented LinkedIn as almost a fourth information loop, where he goes regularly to get industry-specific news and updates from his professional contacts. For Cortana, he highlighted Cortana Commitments, calling it a feature that "saves me" every day. "I send mails to somebody saying I'll follow up tomorrow. And then, of course, I forget to put it in To Do. But the one thing that Cortana does is it remembers."
While Bing and Edge are likeliest to get eyerolls, Nadella brought up interesting use cases for both Microsoft's Google-lagging search engine and its also-ran browser. In a better-together scenario, Nadella showed the power of being logged into Azure Active Directory with Bing's new indexing capability. Nadella conducted a Bing search for Microsoft channel chief Gavriella Schuster, and Bing displayed her internal corporate profile and presented a Microsoft campus map with a pin in Schuster's office on a floor plan of her building. With Edge, meanwhile, he demonstrated the ability to view a news article on his phone and then move that page to display on his Surface device.
Use of Power BI represents an organizational shift at Microsoft that affects individual users' daily work. "If there's one tool that's changed the culture inside the company, perhaps Power BI is the one I'll point to. Because one of the things that we're trying to do is, how do we move away from all these lagging indicators of success but fall in love with leading indicators of success, like usage or consumption or satisfaction?" Nadella said in demonstrating the app's graphical displays.
Finally, Nadella demonstrated Microsoft's Stream technology as one of his tools for quickly reviewing company video events for points of interest. On stage at Inspire, Nadella used Schuster's Tuesday keynote, which had been transcribed and timecoded in a searchable section next to the video display.
After describing his daily use of Microsoft 365, Nadella challenged partner and Microsoft field sales employee attendees to take the bundle to the market: "The opportunity for everyone here is to take Microsoft 365 and apply it for cultural transformation in large enterprises; for productivity in small businesses; to be able to really do industry-specific workflows in health care, in manufacturing, in financial services; to be able to take it to firstline workers; [and] to extend it to business processes with Dynamics 365."
Posted by Scott Bekker on 07/18/2018 at 3:13 PM
Microsoft will highlight new features and programs around Azure and Microsoft 365 during the Microsoft Inspire 2018 partner conference that kicks off on Sunday.
In advance of the partner conference, which runs most of next week and will be co-located in Las Vegas with the Microsoft Ready internal sales conference, the company on Thursday made dozens of product and partnering announcements.
The announcements should form a rough outline of the major topics and themes that Microsoft will focus on at the conferences. However, it's likely that Microsoft is holding a few major revelations back for its keynote speakers throughout the week, including Jason Zander, executive vice president for Azure; Ron Markezich, corporate vice president for Microsoft 365; Anand Eswaran, a corporate vice president in the enterprise business; and CEO Satya Nadella.
Many of the biggest announcements involve the flagship Azure cloud platform, and will be featured on Tuesday. Also likely to be featured in the Tuesday keynote lineup are a series of announcements involving Microsoft 365, which is Microsoft's term for the technology bundle that includes Windows 10, Office 365 and Enterprise Mobility + Security (EMS).
For Azure, Microsoft is rolling out several significant previews. One is Azure Data Box Disk for moving data into Azure. Building on the Azure Data Box appliance for data migrations, the Data Box Disk is an SSD-based option for migrating up to 35TB for either one-time or recurring migrations. Meanwhile, availability of the original Azure Data Box is being expanded to a preview version in Europe and the United Kingdom.
New Azure services entering preview include Azure Virtual WAN and Azure Firewall. The Virtual WAN networking service provides optimized and automated branch-to-branch connectivity, and provides mechanisms to connect on-premises routers and SD-WAN systems, according to a blog post by Zander. As for the firewall, Zander described it as "a fully stateful firewall as a service with built-in high availability and unrestricted cloud scalability."
Reaching full general availability on Thursday was a next-generation version of Azure SQL Data Warehouse, with doubled query performance, optimizations for data movement and the ability to support up to 128 concurrent queries. On the Power BI front, the data-related cloud service received several enhancements to make it more practical for business analysts to work with Big Data. For customers with on-premises versions of Windows Server and SQL Server 2008/2008 R2, Microsoft unveiled an offer that would allow them to migrate those workloads to Azure and get critical security updates for them past the end-of-support deadline at no charge.
As another way to assist customers in moving to Azure, Microsoft unveiled a new category of specialized managed service provider (MSP) partners with expertise in Azure migrations. "These expert partners have proven real world proficiency and skills, for datacenter lift-and-shift, born-in-cloud new applications, and everything in-between," wrote Corey Sanders, corporate vice president of Azure, in a blog post about the new program, which is called the Azure Expert Managed Service Provider program.
Sanders detailed the requirements for MSPs to join and remain in the program. "Azure Expert MSPs complete a rigorous audit by an independent third party, and also provide multiple customer references of Azure managed services projects delivered over the last 12 months. Furthermore, to retain the badge, these expert partners need to continue to meet pre-requisites annually and complete a progress audit every year," Sanders wrote.
Under the Microsoft 365 umbrella, Microsoft announced several end user-focused enhancements around Inspire. First among those is a free version of Microsoft Teams that is remarkably robust and available immediately in 40 languages. Features in the free version include support for up to 300 people, unlimited chat messages, search, built-in audio and video calling for individuals and groups, 10GB of team file storage, an additional 2GB per person of personal storage, and real-time content creation integration with Office Online apps.
In an e-mail, Dux Raymond Sy, CMO of AvePoint, a major Microsoft SharePoint ISV partner, called the announcement a big blow by Microsoft against Slack. "With this new freemium model, it's hard to see how smaller organizations would choose Slack for their chat-based collaboration over the superior integration and security features that Microsoft Teams provides," he said.
Another major new capability within Microsoft 365 is intelligent events. Calling them artificial intelligence-powered, Microsoft said the event infrastructure is designed to allow anyone in an organization to create live and on-demand events. Enhancements to the event experience include a speaker timeline using facial detection to identify speakers, speech-to-text transcription with timecoding and closed captions.
In an effort to help organizations improve efficiency and enforce work-life balance, Microsoft announced features called Workplace Analytics and MyAnalytics nudges. Using Office 365 data, Workplace Analytics identifies collaboration patterns that impact productivity, workforce effectiveness and employee engagement, according to Microsoft. The nudges, meanwhile, are aimed at encouraging employees to reduce after-hours impacts on co-workers, preserving blocks of "focus time" in employees' schedules and running more effective meetings. MyAnalytics nudges will begin appearing in Outlook this summer.
One product that hit general availability on Thursday is the Microsoft Whiteboard app for Windows 10, which Microsoft describes as a "freeform, intelligent canvas for real-time ideation, creation and collaboration." The app was previewed in December, and more previews will be on the way for iOS and Web versions of Whiteboard.
Posted by Scott Bekker on 07/13/2018 at 8:42 AM
Microsoft on Tuesday began taking orders for a new, slightly smaller and slightly less expensive Surface device that will start shipping Aug. 2 in several markets, including the United States.
Panos Panay introduced the Surface Go in a blog post Monday night. "Starting at $399 MSRP, it represents a new entry point for the Surface family, while keeping the premium qualities that have come to define it," Panay said.
At that size and price, the idea is a machine that's good for entertainment and educational use but fully capable of tackling most work tasks, whether on the road or at home.
"Being able to run Office apps on this device with its portability is one of the things that was critical to the experience we had in mind when we designed Surface Go -- the productivity of having the apps you use for work and school with the flexibility to relax and read or watch a show on Netflix or Hulu," Panay said.
The latest device offering from Redmond has a 10-inch screen, weighs 1.15 pounds, is a third of an inch thick and runs a 7th Generation Intel Pentium Gold Processor 4415Y. The unit ships with Windows 10 S and a 30-day home trial of Office 365 Home.
Headlining the included ports is a USB-C jack, an interface that Microsoft only recently made available for some other Surface products via a dongle. The Surface Go also includes a Surface Connect port, a Surface Type Cover port, a headphone jack and a microSDXC card reader. The system has Wi-Fi and Bluetooth built in, as well.
Surface Go comes in two versions. The $399 version has 64GB of eMMC storage and 4GB of RAM. A $549 system has 128GB of SSD storage and 8GB of RAM. Both ship on Aug. 2 in the United States, and pre-orders are also available in Canada, Australia, New Zealand, the United Kingdom, Ireland, France, Germany, Austria, Belgium, Luxembourg, the Netherlands, Switzerland, Denmark, Finland, Norway, Sweden, Poland, Italy, Portugal and Spain.
Next on the list for pre-orders over the coming weeks are Japan, Singapore, Korea, Taiwan, Malaysia, Thailand, Hong Kong and China. Other markets will follow later, the company says.
While the base price is relatively low, many of the most important aspects of the Surface Go experience require accessories. A Surface Go Signature Type Cover, like other Surface Type Covers, provides a connected hardware keyboard and folds up to an ergonomic angle. Available in burgundy, platinum, cobalt blue or black, the Surface Go Signature Type Cover costs $99.99. A Surface Pen is already available in the same four colors and also costs $99.99.
Microsoft also unveiled a Surface Mobile Mouse for $34.99 in burgundy, platinum or cobalt blue. The new type cover and mobile mouse both start shipping Aug. 2. Another accessory, the Surface Dial, is also supported by the Surface Go, but less essential for most Surface use cases.
Posted by Scott Bekker on 07/10/2018 at 9:09 AM
Cryptocurrency is still at the frontier of cybercriminal activity, but the vehicle to exploit it is taking a different form.
That's the conclusion of a new annual report from security researchers at Kaspersky Lab.
In its third annual report on ransomware, Kaspersky noted that the year-over-year, double-digit increases in ransomware didn't continue during the most recent study period of April 2017 to March 2018. After an April 2016 to March 2017 period in which ransomware was the most significant security story of the year, the trend petered out over this last year.
"We have found that ransomware is rapidly vanishing," according to the Kaspersky report, which is based on anonymized data processed by the Kaspersky Security Network. The report, "KSN Report: Ransomware and malicious cryptominers 2016-2018," was released in late June.
Yet, a new criminal business opportunity made possible by cryptocurrencies is filling the ransomware vacuum. "Cryptocurrency mining is starting to take its place," the report said.
There was a 30 percent drop in the total number of users who encountered ransomware, year over year. Kaspersky logged about 1.8 million users running into ransomware in 2017 to 2018 compared to 2.6 million users the year before.
Conversely, users encountering miners rose nearly 45 percent, from about 1.9 million in 2016 to 2017 to 2.7 million during the year ending this April.
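Those shifts can be sanity-checked from the rounded user counts above (a quick sketch; the report's headline "30 percent" and "nearly 45 percent" figures come from Kaspersky's exact, unrounded counts):

```python
def pct_change(old: float, new: float) -> float:
    """Percent change from old to new."""
    return (new - old) / old * 100.0

# Rounded user counts from the Kaspersky report, in millions.
ransomware = pct_change(2.6, 1.8)  # 2016-17 period -> 2017-18 period
miners = pct_change(1.9, 2.7)

print(f"Ransomware encounters: {ransomware:+.0f}%")  # prints -31%
print(f"Miner encounters:      {miners:+.0f}%")      # prints +42%
```

The small gaps versus the cited percentages come from rounding the user counts to one decimal place.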
The emergence of cryptocurrencies like Bitcoin enabled an explosion in ransomware by making it relatively efficient and straightforward for criminals to collect untraceable ransom payments from victims. Kaspersky researchers believe many of the same organizations have transitioned to installing cryptomining malware on victim computers to create botnets that they can use to mine cryptocurrencies for profit.
"Miners are a discreet and modest way to make money by exploiting users, and are a far cry from the noisy and very noticeable encryption of victim devices. Instead of the large one-off payout achieved with ransomware, cybercriminals employing mining as a tactic can benefit from an inconspicuous, stable and continuous flow of funds," the report said.
The Kaspersky report is not the first to note the popularity of cryptomining attacks. Malwarebytes Labs in April released a quarterly report showing that cryptomining had shot up to be the second-most common attack threat for both consumers and businesses.
Kaspersky researchers also found effects on businesses. "In 2017 we started seeing botnets designed to profit from concealed cryptomining, and attempts to install miners on servers owned by organizations. When these attempts are successful, business processes suffer at victim companies, because data processing speeds fall substantially," the report noted.
While cryptomining attacks might be on the rise, Kaspersky also cautioned against taking ransomware lightly. "Ransomware is decreasing in volume. However, it is still a dangerous threat," according to the report.
Advising organizations to stay vigilant is a good plan, given that the highest-profile recent ransomware attack, against the city of Atlanta, hit just at the tail end of Kaspersky's study period in late March.
Atlanta chose not to pay a ransom worth about $51,000 in cryptocurrency, and the city's mitigation costs so far have topped $2.6 million, according to reports based on public records. Earlier this month, a city official said in a public meeting that the aftermath of the ransomware attack will require another $9.5 million in unanticipated spending.
Posted by Scott Bekker on 07/05/2018 at 7:45 AM
One of Microsoft's most innovative contributions to productivity is on hiatus.
Since late last year, Microsoft has been talking about "Sets," a way to group related information from different applications into one project, and has included the feature in Insider Preview builds of Windows 10.
The technology essentially creates tabs within a window related to the same project. For example, if a user opens a Word document for a research project, Sets allows the user to create additional tabs from other programs within that window. One tab could be a Microsoft Edge browser page, another could be a PowerPoint deck, et cetera.
When the user closes the window, all of the components of the project are saved together, so the next time the project is opened, all the context would still be there. Integration with Office 365 and the Microsoft Timeline are supposed to make it easy to find other related content from within Sets.
Joe Belfiore, corporate vice president in the Microsoft Operating System Group, highlighted the feature at the Build 2018 conference in May, although he said at the time that Microsoft would only release Sets when "we think it's great."
Apparently, Microsoft needs to take Sets back to the drawing board, rather than just tweak it. The company announced this week in the release notes for Windows Insider Preview Build 17704 that Sets is being pulled for now.
"Thank you for your continued support of testing Sets. We continue to receive valuable feedback from you as we develop this feature helping to ensure we deliver the best possible experience once it's ready for release. Starting with this build, we're taking Sets offline to continue making it great," noted the blog post from Dona Sarkar and Brandon LeBlanc in a reference to Belfiore's Build comment.
"Based on your feedback, some of the things we're focusing on include improvements to the visual design and continuing to better integrate Office and Microsoft Edge into Sets to enhance workflow. If you have been testing Sets, you will no longer see it as of today's build, however, Sets will return in a future WIP flight. Thanks again for your feedback," the post noted.
Posted by Scott Bekker on 07/02/2018 at 7:37 AM
That long-awaited USB-C dongle is coming to the Microsoft Surface.
A little over a year ago, Microsoft's Surface chief Panos Panay told technology news site The Verge in an interview that the dongle would be coming.
Now The Verge is reporting that a rather large dongle to connect USB-C devices will be available starting on Friday. According to the report, the dongle will cost $79.99, will be available for commercial customers and will plug into the Surface Connect port.
The Friday timeframe seems on track. A day after The Verge released its report, ZDNet and Redmond columnist Mary Jo Foley spotted a new reference to a "Surface Connect to USB-C Adapter" on Microsoft's Surface Web site.
Last year, Panay said he believed in USB-C, but that the right approach would be a dongle, since he didn't think it was the right move yet to replace any of the devices' limited number of other ports and connectors.
The Surface Book 2 already includes a USB-C port, along with two USB 3.0 Type A ports, a UHS-II SDXC card reader, a 3.5mm headphone jack and two Surface Connect ports.
The dongle will work with the first-generation Surface Book and with the Surface Pro. Surface Pro connections include USB 3.0, a 3.5mm headphone jack, a microSDXC card reader, Mini DisplayPort, Cover port and Surface Connect port. The first Surface Book has two USB 3.0 Type A ports, a headphone jack, a card reader, a Mini DisplayPort and two Surface Connect ports.
Posted by Scott Bekker on 06/26/2018 at 9:19 AM
Brian Krzanich is out as CEO of Intel after an investigation into a past consensual relationship with an Intel employee.
Intel announced Thursday morning that Krzanich, 58, was resigning his post and his seat on the Intel board. Chief Financial Officer Robert Swan was named interim CEO effective immediately, and Intel has begun a search for a permanent CEO.
According to a company statement, "Intel was recently informed that Mr. Krzanich had a past consensual relationship with an Intel employee. An ongoing investigation by internal and external counsel has confirmed a violation of Intel's non-fraternization policy, which applies to all managers. Given the expectation that all employees will respect Intel's values and adhere to the company's code of conduct, the board has accepted Mr. Krzanich's resignation."
Krzanich has been CEO at Intel for five years, and started at Intel as a process engineer in 1982. Swan has been CFO at Intel since 2016, and held similar roles previously at eBay Inc., Electronic Data Systems Corp. and TRW Inc.
The bombshell comes just five days before Intel's second-quarter earnings call. Intel had a rough start to the year with disclosure of the Spectre/Meltdown security issues, related class-action lawsuits and questions about a Krzanich stock sale before the security flaws were made public.
Yet Intel had a strong Q1 and is bullish about the just-ended quarter. Alongside the statement about the resignation, Intel raised its guidance for Q2. The company now expects adjusted earnings of $0.99 per share on $16.9 billion in revenue, up from a previous forecast of $0.85 per share on $16.3 billion in revenue.
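For context, the size of that raise can be computed from the guidance figures (a rough sketch; Intel's release states the new targets, not these percentage deltas):

```python
def raise_pct(old: float, new: float) -> float:
    """Percentage increase from an old guidance figure to a new one."""
    return (new - old) / old * 100.0

eps = raise_pct(0.85, 0.99)      # adjusted earnings per share
revenue = raise_pct(16.3, 16.9)  # revenue, in billions of dollars

print(f"EPS guidance raised by     {eps:.1f}%")      # prints 16.5%
print(f"Revenue guidance raised by {revenue:.1f}%")  # prints 3.7%
```

In other words, the earnings raise is far larger in relative terms than the revenue raise, implying improved expected margins.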
Posted by Scott Bekker on 06/21/2018 at 9:01 AM
Since Microsoft co-founder Bill Gates stepped back from full-time involvement at the company where he amassed one of history's largest fortunes, he's become arguably the world's most influential philanthropist and social change-minded investor.
Now a company partly owned by Gates, Carbon Engineering, along with a team of researchers from Harvard University, is reporting a key breakthrough in a critical area of climate technology called direct air capture.
The idea is to pull climate-altering carbon out of the air and convert it to gasoline or other fuels. If the approach could be adopted at scale, it would provide a way to make existing transportation technologies like cars and jets carbon-neutral.
In other words, it could stop carbon dioxide buildup in the atmosphere without requiring that people and companies swap out all of their vehicles for cleaner technologies. The technique, based on widespread technologies, could also be used to capture carbon and trap it underground.
In the new scientific journal Joule, the team last week published research detailing how Carbon Engineering has been implementing the technique at a pilot plant in British Columbia. Critically, the paper shows that a metric ton of carbon can be removed from the atmosphere for less than $100. Previous expert estimates had put the cost at a more prohibitive $600 per ton.
The Atlantic has a lot of detail on the study, the technique and the cautiously optimistic reaction of scientists not affiliated with the research.
Posted by Scott Bekker on 06/13/2018 at 8:21 AM
Microsoft on Wednesday unveiled big feature changes -- on a slow timetable -- for the Office user experience, including a simplified ribbon, new colors and icons, and a search overhaul.
The changes will roll out in stages over the next few months, starting with Web versions, and will be exclusive to Office.com and Office 365. Apparently remembering the significant user backlash that accompanied the original rollout of the Office ribbon, Microsoft is taking care to present the changes as a work-in-progress that will be tested with initial user groups and modified as necessary, rather than blasted out to the billion-plus monthly users of Office.
"We plan on carefully monitoring usage and feedback as the changes roll out, and we'll update our designs as we learn more," wrote Jared Spataro, corporate vice president for Office and Windows Marketing, in a blog post announcing the updates.
Additionally, user control over implementing the changes is a key design principle. "We want to give users control, allowing them to toggle significant changes on and off," Spataro said. In fact, there is no current schedule to push the ribbon changes to Word, Excel and PowerPoint for Windows. "They're the preferred experience for users who want to get the most from our apps. Users have a lot of 'muscle memory' built around these versions, so we plan on being especially careful with changes that could disrupt their work," he wrote.
The ribbon takes up less screen real estate and is more context-driven, with Microsoft trying to anticipate what a user wants to do. An underline appears below the main menu tabs, with options appearing underneath the tab in a single line that uses new colors and icons intended to provide more contrast and more intuitive navigation.
In a video demonstration, Jon Friedman, chief designer for Office at Microsoft, opened a Word document from the Office 365 interface to demonstrate a new speediness to the process. "Office is rebuilt on a modern platform to be faster than ever," Friedman said.
The new search functionality includes a more prominent search bar and brings in what Microsoft calls "zero query search," in which recommendations are presented based on Microsoft's predictive algorithms and data about the user's behavior and working relationships.
Microsoft shared several milestones in its timeline for rolling out the new Office features:
- All commercial users of Office.com, SharePoint Online and the Outlook mobile app have immediate access to the search changes.
- Select consumer users of Web versions of Word will see the simplified ribbon this week.
- The color and icon changes will come to the Web versions of Word shortly.
- Select Insiders using Windows versions of Word, Excel and PowerPoint will see color and icon changes later this month.
- Color and icon changes will come to Outlook for Windows in July.
- Select Insiders will see the simplified ribbon on Outlook for Windows in July.
- In August, commercial users of Outlook on the Web will see the search changes.
- Also in August, Outlook for Mac users will get the color and icon changes.
Posted by Scott Bekker on 06/13/2018 at 10:41 AM
Any enterprise customer of Microsoft's that isn't using GitHub already can expect a big push toward that software development and version control platform shortly.
Microsoft announced a deal on Monday to acquire GitHub for $7.5 billion in stock. The transaction has been approved by both companies' boards and is expected to close before the end of the year.
Microsoft CEO Satya Nadella identified enterprise usage as one of three major opportunities around GitHub in a blog about the deal.
"We will accelerate enterprise developers' use of GitHub, with our direct sales and partner channels and access to Microsoft's global cloud infrastructure and services," Nadella said.
Of the other two opportunities, one involved bringing Microsoft developer tools and services to GitHub's huge community of developers, many of whom primarily use open source tools. The other opportunity is deepening Microsoft's engagement with developers at every stage of the development lifecycle, Nadella said, "from ideation to collaboration to deployment to the cloud."
The GitHub community currently includes 28 million developers with more than 85 million code repositories. Microsoft regularly boasts of being the most active organization on GitHub. At the Microsoft Build conference in May, Microsoft said it had the most open source project contributors on the platform in 2016. With the acquisition announcement, Nadella claimed that Microsoft's 2 million "commits," or updates made to projects, make it the most active organization on GitHub.
Nadella's comments suggest that enterprise organizations, with development processes that pre-date the 10-year-old GitHub model, could face some adjustments in processes like version control that become integrated into Visual Studio and other parts of the Microsoft developer platforms.
Meanwhile, Nadella and GitHub Co-Founder Chris Wanstrath sought to assure GitHub's massive user community that big business-focused Microsoft wouldn't wreck the open source development platform.
"Most importantly, we recognize the responsibility we take on with this agreement. We are committed to being stewards of the GitHub community, which will retain its developer-first ethos, operate independently and remain an open platform," Nadella promised. "Developers will continue to be able to use the programming languages, tools and operating systems of their choice for their projects -- and will still be able to deploy their code on any cloud and any device."
In his own blog entry, Wanstrath pointed to Microsoft's ongoing engagement with the open source community and its handling of recent acquisitions as reasons to trust Microsoft's intentions. "Their work on open source has inspired us, the success of the Minecraft and LinkedIn acquisitions has shown us they are serious about growing new businesses well, and the growth of Azure has proven they are an innovative development platform," Wanstrath said.
Wanstrath will become a technical fellow at Microsoft once the acquisition closes, reporting to Microsoft Cloud + AI Group Executive Vice President Scott Guthrie. Nat Friedman, the founder of Xamarin, which was acquired by Microsoft in 2016, will become CEO of GitHub, also reporting to Guthrie.
Posted by Scott Bekker on 06/04/2018 at 2:39 PM
With a month to go until the end of Microsoft's fiscal year, the executive shuffles continue in the Windows business.
Microsoft CEO Satya Nadella unveiled a huge shift in a March 29 memo revealing a reorganization whose headline departure was Terry Myerson, the longtime Microsoft veteran who headed the Windows and Devices Group. Myerson was to stay at Microsoft for a while to help with the transition.
The reorg was widely viewed as Nadella demoting Windows, the longtime centerpiece of Microsoft's strategy, to further emphasize artificial intelligence (AI), cloud computing and mixed reality.
To execute on the strategy, Nadella formed two new engineering units. One was Experiences & Devices, led by Corporate Vice President Rajesh Jha. The other was Cloud + AI Platform, led by Scott Guthrie, head of Microsoft Cloud and Enterprise.
All About Microsoft's Mary Jo Foley reported Thursday that the executive moves, even among those specifically named as continuing their positions in the March memo, are not finished.
Kudo Tsunoda is out as corporate vice president for Next Gen Experiences, Foley reported. Tsunoda had reported to Jha and was focused on mixed reality, 3-D, Story Remix, photos, HoloLens and other related projects. Foley reported that Tsunoda is looking for another role inside the company, and that his team has been disbanded and moved to other places.
Several teams in other areas are also being shifted around. Some teams on Executive Vice President Jason Zander's Azure and Windows engineering organization are moving into Jha's unit, and the design team and Windows Insider program are moving from Windows engineering into Corporate Vice President Joe Belfiore's Windows client experience team, Foley reported.
Posted by Scott Bekker on 06/01/2018 at 12:56 PM
Security researchers on Wednesday called on users of small office/home office (SOHO) routers and some NAS devices to reset to factory defaults in order to partially protect themselves against destructive malware dubbed VPNFilter that has spread to an estimated 500,000 devices in 54 countries.
Cisco Talos Intelligence Group, which conducts broad industry research for vulnerabilities beyond just Cisco hardware, also called for ISPs who provide routers to customers to reboot the devices on customers' behalf and to work with Talos and other security professionals to update all devices when a patch is available.
Talos said VPNFilter has been found on routers manufactured by Linksys, MikroTik, NETGEAR and TP-Link, as well as NAS devices made by QNAP. No Cisco devices, or devices from other manufacturers, have been found to be infected yet.
However, in a lengthy blog post on the issue, Talos said it assesses with high confidence that its list of affected devices is incomplete. "Due to the potential for destructive action by the threat actor, we recommend out of an abundance of caution that these actions be taken for all SOHO or NAS devices, whether or not they are known to be affected by this threat," the post stated.
While they have been tracking the malware for a few months, Talos researchers accelerated public disclosure plans over concerns that efforts to spread the multistage modular malware platform had accelerated this month. May brought port scans indicative of attempts to infect additional MikroTik and QNAP devices in more than 100 countries; further evidence of code overlap between VPNFilter and the BlackEnergy malware, which was identified in previous attacks against devices in Ukraine; and sharp spikes in infection activity, especially in Ukraine.
The precise exploitation route has not yet been identified, although Talos does not believe any zero-day flaws are involved. The company said known vulnerabilities in the affected devices provide sufficient avenues for infection.
The malware itself appears to be both sophisticated and versatile. Talos described it as having three stages. The first stage is designed to gain a foothold on the system and persists despite a reboot, meaning the current workaround cannot wipe out that portion of the threat. The first stage also features redundant mechanisms to connect to a command-and-control (C2) server.
That connection causes the infected device to download the Stage 2 malware, which does not persist after a reboot. Capabilities of the second stage include file collection, command execution, data exfiltration, device management and, in some versions, self destruction. The self-destruct capability is particularly nasty in that it overwrites part of the device firmware and then causes a reboot, making the device unusable.
"The destructive capability particularly concerns us. This shows that the actor is willing to burn users' devices to cover up their tracks, going much further than simply removing traces of the malware. If it suited their goals, this command could be executed on a broad scale, potentially rendering hundreds of thousands of devices unusable, disabling internet access for hundreds of thousands of victims worldwide or in a focused region where it suited the actor's purposes," the blog post stated.
Even devices running versions of the Stage 2 malware without the self-destruct capability would be vulnerable in a mass-destruction attack, given the command execution capability of the base-level malware.
A third stage found in some devices consists of modules that plug into the Stage 2 malware. One module discovered so far includes a packet sniffer capable of stealing Web site credentials and monitoring Modbus SCADA protocols. Another module is designed to allow the Stage 2 malware to make connections over Tor.
The Talos blog post includes links to Snort rules for detecting VPNFilter and for protecting against known vulnerabilities in the affected devices, as well as anti-virus signatures for VPNFilter.
Posted by Scott Bekker on 05/23/2018 at 3:41 PM
A standard reaction to a private company sharing fairly specific financial results on a regular basis is to assume that they're trying to attract a buyer, but Veeam Software's co-founder insists that's not the case.
"What are the reasons for us to sell? We are fast-growing, we have a great market. There is not a single reason. And we don't have venture capitalists. They don't need the exits, they are not pushing us for the exits," said Ratmir Timashev, Veeam's co-founder and senior vice president for marketing and corporate development, during an executive roundtable at VeeamOn 2018 on Monday. The show runs through Wednesday in Chicago.
In late April, Veeam announced that the first quarter of 2018 was its 39th straight quarter of record bookings growth. The company claimed 21 percent growth year-over-year and said it was on track to become a $1 billion company (by revenues) in 2018. In a graphic shown to partners earlier in the day, Veeam told partners it was a $200 million company in 2012, was on track to hit $1.1 billion in 2018, and was aiming for $1.5 billion in 2020.
Sitting next to Timashev on stage, Co-CEO and President Peter McKay tackled the related question of whether Veeam was looking to go public.
"Why do you go public? You need cash. We don't need cash. You want some liquidity for investors? Our investors don't need liquidity," said McKay, adding that the company had "blinders on" and was focused on building the company.
Timashev took up the blinders comment and stretched the timetable. "Our goal is to build the company at least for the next 10 years -- dominate the multi-cloud and what's going to come after multi-cloud."
At VeeamOn, the company is rolling out new branding to reach a larger total addressable market. The company is moving from its recent tagline of "availability for the always-on enterprise" to "intelligent data management for the hyper-available enterprise."
McKay walked partners through a simplified Veeam history in an earlier keynote, saying the company was about virtual machine backup from its founding in 2006, recovery in 2010, availability last year and hyper-availability now. He defined hyper-availability as the need to protect and make available data that is critical to the business, growing exponentially and sprawling across locations that include physical datacenters, virtual datacenters, public clouds and SaaS applications.
Timashev said the current battle in the industry is to dominate multi-cloud, that portion of the sprawling infrastructure that includes providing availability for services like Amazon Web Services (AWS), Microsoft Azure, Google Cloud and IBM Cloud.
"We won the first battle in the new modern datacenter. The next battle for the next five years is going to be multi-cloud," said Timashev.
A big component of Veeam's multi-cloud push came in January when it made its first acquisition in 10 years with the all-cash purchase of N2WS, a provider of cloud-native enterprise backup and recovery for AWS.
As validation of the potential of the larger multi-cloud market, the executives said the AWS administrator who buys an N2WS solution is different from the buyer of Veeam's other solutions.
"They're targeting a slightly different buyer, they are targeting the AWS buyer," Timashev said of N2WS. "If we are selling to a centralized IT person, N2WS technology will be part of our solution. In a couple of years, this AWS admin will become part of the central IT."
McKay said the difference was confirmed for Veeam at an N2WS booth at a recent cloud show. "We're hearing the name of all of our customers, and they didn't know who we were. This cloud world is a different buyer," McKay said.
With Veeam not looking to sell, McKay addressed whether the company was looking to buy other companies in the newly strategic areas of backup and recovery software for Azure, Google, IBM Cloud and for key SaaS applications. He suggested that while the next acquisition might not take another 10 years, the company also wouldn't be snapping up a lot of companies at once.
"We zero in on our market and what we do. If we're going to expand, we're going to expand at the right time, in a digestible fashion," McKay said, adding that the company wants to strike the right balance. "We feel like [we have the] best engineering, R&D, but there's a time to market, there's a time to buy."
Pictured: Veeam President Peter McKay (left) and Veeam Co-Founder Ratmir Timashev.
Posted by Scott Bekker on 05/15/2018 at 9:51 AM
Microsoft Build 2018, the company's flagship developer conference, wrapped up this week in Seattle with a lot of news. With dozens of announcements, including 70 new capabilities in Azure and more than 100 new features for the Bot Framework, it's impossible to capture even all of the important ones. What follows are 11 key moments from the three main keynotes that highlight important themes of the show.
1. Microsoft 365 Gets Promoted
It's been clear for a long time that Microsoft Azure is the key strategic umbrella platform for Microsoft. The question is, how does Microsoft prioritize and organize the rest of its products? At least for Build, Microsoft CEO Satya Nadella made that clear.
"We're focused on two massive platform opportunities. One, Microsoft Azure, the other Microsoft 365," Nadella said in his kickoff keynote.
The designation of Microsoft 365 alongside Azure is a significant promotion for Microsoft 365, which previously seemed more like a licensing construct than an organizing principle for Windows 10, Office 365 and Enterprise Mobility and Security (EMS) services.
Joe Belfiore, corporate vice president for Windows at Microsoft, expanded on the architectural vision around Microsoft 365 in his Day 2 keynote. "We want Microsoft 365 to embrace multiple devices. We want it to be smart about letting users move from a PC to a phone. We want Microsoft 365 to embrace multi-sense use. We want it to be able to fluidly go from mouse and keyboard to touch to ink. We want to be embracing vision, new ways of working like wearing a VR display, and so a lot of these ideas are not only work that we're doing in the products that we build, but we're also trying to platformize so the code that all of you build will make your organizations more effective in the same kind of way," Belfiore said.
Short version: pay attention to Microsoft 365 as a platform.
2. Azure is Accelerating
Microsoft's big ambitions for Azure itself are expanding. Nadella laid out Microsoft's grandiose plans rather bluntly.
"Azure is being built as the world's computer," Nadella said.
He marshaled a fair amount of evidence for the assertion, citing Azure's scale:
- There are more than 50 regions
- The Azure cloud counts 70-plus certifications worldwide
- About 70 new Azure capabilities were launched during Build
Nadella also enumerated all the ways the platform is expanding beyond the core public cloud service: "As computing spreads, as there is need for computing at the edge, we are building out Azure, Azure Stack, Azure IoT Edge and Azure Sphere as this one computing fabric that supports this new application model."
That intelligent edge emerged as an important theme at Build, and Microsoft also spent some time clarifying use cases for Azure Stack, the private cloud version of Azure that a company can run in its own datacenter disconnected from the public cloud. Azure Stack is not just, or even primarily, about organizations that distrust public cloud networks. Instead, Microsoft is positioning Azure Stack as a prime example of the intelligent edge.
"Azure Stack, which is just a year old, is supporting multiple scenarios. For example, Chevron is using it so that they can essentially have Azure in a disconnected way at their oil rigs. A bank in South Africa, ABSA, is using it for their regulatory workloads, as well as using the public cloud, Azure. And then Schlumberger is actually doing distributed computing. So they use the public cloud, as well as Azure Stack, as one fabric to be able to distribute compute so that it's close to where the data is," Nadella said.
3. Commoditizing AI
Closely linked to Azure is artificial intelligence, and AI was a huge theme at Build.
Nadella reeled off a number of milestones Microsoft had achieved comparing machine performance versus human performance on various tasks, but then put them in a perspective that demonstrated Microsoft's AI philosophy.
"Who cares about breakthroughs we achieve? What matters is can we translate these into frameworks, tools and services, and put them in your hands as developers so that you can take AI and have impact in every industry in every application. That's what's important. We truly are committed to, in some sense, commoditizing AI," Nadella said.
Among the more specific AI announcements during Build week, one significant enhancement was a preview of Project Brainwave, a distributed, real-time AI fabric that currently works with FPGAs from Intel.
4. 200 Million Corporate Users of Windows 10
Belfiore shared Windows 10 deployment momentum numbers that show Microsoft's flagship client OS gaining traction among business users.
"We see Windows 10 deployment really ramping up significantly in commercial accounts. Right now there are over 200 million people in corporate accounts using Windows 10 and we've seen that adoption rate increase now at 79 percent year-over-year growth," Belfiore said.
5. Bring Out the Drones
A big theme at Build was enabling developers to create solutions combining Internet of Things (IoT), Azure cloud, AI, cameras and drones.
Sam George, director of Azure IoT Engineering and Program Management at Microsoft, oversaw a keynote demo of a drone visually inspecting pipes for damage.
"For the first time ever, we're able to stream video back from that drone to this laptop running IoT Edge and our AI model, which was developed in the cloud," George said.
Major related news at the show included camera partnerships with Qualcomm, a drone-related partnership with DJI and the revival of Kinect, Microsoft's groundbreaking motion sensor for gaming, as an IoT device.
6. New Tricks for Cortana
The Microsoft digital assistant Cortana was a star of the show, most visibly in an on-stage demo with Amazon Alexa. The two assistants called upon one another to handle tasks in their specialty areas, with Alexa focusing on home and consumer jobs and Cortana handling work schedules and e-mails.
Tom Taylor, a senior vice president for Amazon Alexa, had one of the best applause lines of the show when he asked Alexa: "What do you think about Cortana?" Alexa replied: "I like Cortana. We both have experience with light rings, although hers is more of a halo."
A significant theme around Cortana was the emergence of the personification of AI as Cortana-based assistance versus being an assistant.
In that vein and around the broader enablement of chats and bots, Nadella said, "At this conference, we're launching 100-plus new features for the Bot Framework so that you can continue to build these conversational interfaces and give them more of the customization. So, for example, you can have a custom wake word, you can give it custom speech, you can even give it custom personality, take some of the FAQs and turn them into Q&A. And then take the corpus of data you have in terms of conversations and use that as label data to have a full dialogue system."
7. Open Source Love Continues
Microsoft can't get developers together anymore without professing its undying commitment to open source in some way.
This time it took the form of a testimonial from Jason Warner, the senior vice president of technology at GitHub, who took the keynote stage to talk about the scale of Microsoft's efforts on GitHub.
"I think it's amazing to see what Microsoft has done in the past few years," Warner said. "The industry has shifted, and they realize the power of open source. And, in fact, I don't think it's too bold to say that open source now powers modern software development. And Microsoft might be the best example of a corporation embracing open source. We know from statistics that we have in GitHub that Microsoft is the single largest corporate contributor to open source on GitHub, and there by extension, in the history of open source. In fact, Microsoft has the largest open source community in the entire world with Visual Studio Code."
8. Expanding the Container Story
Microsoft used Build to disclose a further push into the world of containers, with the announcement of the Azure Kubernetes Service (AKS).
"Our new AKS, or Azure Kubernetes Service, provides a fully managed Kubernetes-based orchestration service. It provides built-in auto-patching, auto-scaling and update support, which enables you to take the full breadth of the Kubernetes ecosystem when you're doing your development," said Scott Guthrie, Microsoft executive vice president of Cloud and Enterprise.
9. Faster and Potentially Cheaper Cosmos DB
Microsoft's Cosmos DB has been growing by leaps and bounds. During Microsoft's most recent earnings call, Nadella told investors that the product was on a $100 million annualized revenue pace, making it the fastest-growing database product he'd seen from the company.
Cosmos DB is a globally distributed database that is a core service of all Azure datacenters, allowing customers to scale geographically, and has support for multiple models, including document, graph, key-value, table and column-family, with APIs for SQL, MongoDB, Cassandra, Gremlin and Table.
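The multi-model idea is easier to see with a concrete record. The sketch below is plain Python with no Azure SDK; all names are illustrative, not Cosmos DB API calls. It shows the same order expressed in three of the supported models:

```python
import json

# One logical record, viewed through three of Cosmos DB's data models.
order = {
    "id": "order-1001",
    "customer": "contoso",
    "items": [{"sku": "A-100", "qty": 2}],
}

# Document model: the JSON itself is the stored, queryable unit.
document = order

# Key-value model: the id is the key; the serialized record is the value.
key_value = ("order-1001", json.dumps(order))

# Column-family model: scalar top-level fields become named columns in a row.
row = {col: val for col, val in order.items() if not isinstance(val, list)}

print(sorted(row))  # prints ['customer', 'id']
```

The point of a multi-model engine is that these are views onto one service, not three separate databases to operate.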
At Build, Microsoft unveiled several improvements to Cosmos DB, including a new pricing option that Guthrie said could lead to major cost savings, and, on the technical side, a feature called multi-master write support.
"Cosmos DB now supports unlimited read and write scalability by virtue of a highly decentralized master-less replication protocol support," Guthrie said. "This guarantees single-digit millisecond reads and now writes response time at the 99th percentile anywhere in the world, which is something no other database in the world delivers today."
10. Get To Know the Graph
Alongside the rise of Microsoft 365 comes the rise of an underlying technology called the Microsoft Graph.
Belfiore called the Graph one of the most important things that Build attendees should understand for his Microsoft 365-focused Day 2 keynote.
"I just want you to come into the keynote with one key idea, which is that the Graph is a cloud-backed data store where both you and us can put organizational data in a way that's private and secure to the organization, but in a way that also lets our solutions take advantage of both so that AI can reason against that data, that organizational data, and so that the experiences that both we build and you build can light up and make those end users' lives better," Belfiore said.
Many of the announcements and demos during the keynote related to the Graph. Notable ones included Timeline, a technology that lets users scroll back through their history of activities with the ability to click on entries to pick up where they left off. A related concept was Sets, which could also be accessed through the Timeline, but which allow users to bring up the entire context of a project, such as a Word document with the associated browser tabs that were being used in researching the document.
11. Laying Down a Responsibility Marker
In an era when Facebook executives are testifying before Congress and cable news shows feature constant discussion about technology's role in democratic elections, Nadella laid down a marker on responsibility.
"We...have a responsibility as a tech industry to build trust in technology," Nadella said right at the start of his Build keynote. Echoing recent comments from Microsoft President Brad Smith, Nadella said Microsoft is focused on three core pillars: privacy, cybersecurity and ethical AI.
Posted by Scott Bekker on 05/11/2018 at 10:43 AM
"Graph" is one of those terms that Microsoft has been throwing around for a few years now, but that can be difficult to define.
There was the Office Graph, the current Microsoft Graph, the LinkedIn Graph and other graphs. What is Microsoft talking about?
Joe Belfiore, corporate vice president for Windows at Microsoft, made an effort to break it down Tuesday morning in his Microsoft Build 2018 keynote.
Calling the graph a "key idea," Belfiore stepped back and described it like this for the developer audience at the Seattle show:
"The graph is a cloud-backed data store where both you and us can put organizational data in a way that's private and secure to the organization, but in a way that also lets our solutions take advantage of both. So that AI can reason against that data, that organizational data, and so that the experiences that both we build and you build can light up and make those end users' lives better."
That's not a bad starting point.
More from Build 2018:
Posted by Scott Bekker on 05/08/2018 at 3:00 PM
Five months after they were supposed to be working together, Cortana and Alexa shared a stage on Monday for a joint demo.
Cortana, Microsoft's intelligent assistant that operates primarily from Windows 10, and Alexa, Amazon's assistant whose main platform is Echo devices, were featured calling upon one another's services during the opening keynote of the Microsoft Build 2018 show in Seattle.
It was an integration that Microsoft and Amazon in a joint statement had said they would deliver by the end of 2017. That deadline came and went with no timeline updates from either company.
Microsoft CEO Satya Nadella introduced the demo on Monday during his keynote by stressing how important it is for personal digital assistants to communicate.
"We want to make it possible for our customers to be able to get the most of that personal digital assistant, not be bound to some single walled garden, and for developers to have access to the maximum number of users," Nadella said. "We've been working with our friends across the lake at Amazon to really bring Alexa and Cortana together to benefit every user and every developer out there."
For the demo, Megan Saunders, a general manager on the Microsoft Cortana team, and Tom Taylor, a senior vice president for Amazon Alexa, went to opposite ends of the stage to interact with each other's environments.
Saunders, pretending to be in her kitchen with an Amazon Echo, first asked Alexa to add milk to her shopping list, then made the key request of the tube-shaped speaker appliance: "Alexa, open Cortana." After a pause, Cortana's voice asserted: "Cortana here, how can I help?" From there, Saunders got her appointments for the day, including a dinner with Amazon's Taylor to celebrate the demo, and used voice commands to send Taylor an e-mail that she'd see him tonight.
Taylor, pretending to be in his office working on a Windows 10 laptop, read the e-mail from Saunders, and first asked Cortana to show him the location of the restaurant where he was meeting Saunders. Then, he said the key phrase from the other side: "Hey, Cortana, open Alexa." Soon, the PC was saying, "Hi there, this is Alexa, how can I help?" He then asked Alexa to get him an Uber to the restaurant, and got Alexa to turn off a lamp on his desk.
For a laugh line, Taylor closed by asking Alexa, "What do you think about Cortana?" The response: "I like Cortana. We both have experience with light rings. Although hers is more of a halo."
The integration is currently in a limited beta, and Microsoft has created a Web site where users can sign up to be notified when it is broadly available.
Posted by Scott Bekker on 05/07/2018 at 2:59 PM
Microsoft CEO Satya Nadella on Monday unveiled a fourth-generation version of the company's discontinued Kinect motion-sensing device, which was originally designed for gaming but is being repurposed with more advanced technologies for artificial intelligence, Azure, edge computing and the Internet of Things (IoT).
Project Kinect for Azure is a package of sensors anchored by a next-generation depth camera that also includes on-board processors. An availability timeframe for the sensor package was unclear, with Microsoft saying more details would be coming over the next few months.
"This Project Kinect for Azure is going to have some of the best spatial understanding, skeletal tracking, object recognition, and package some of the most powerful sensors together with the least amount of...noise and also have ultra-wide field of view," Nadella said in his keynote at the Microsoft Build 2018 conference in Seattle.
Kinect was originally launched with Xbox 360 in 2010 with the idea of creating a new category of games that would be controlled strictly by players' motions. The technology did inspire a few titles, but didn't take off as a runaway category. Microsoft released a PC version later that could be used both for gaming and for potential business applications. The company released updated versions of Kinect for Xbox One and for PC before discontinuing all versions of the product last year.
Nadella said Microsoft was inspired by partners' business applications with the PC versions in medical, industrial, robotics and education applications, and he suggested that subsequent technological progress has made this the right time for another run at the technology.
"Since Kinect, we've made a tremendous amount of progress when it comes to the foundational technologies, in HoloLens," Nadella said in reference to Microsoft's augmented reality headset, which the company describes as a holographic computer. "We're taking those advances and packaging them up as Project Kinect for Azure. This set of sensors we expect to be fully integrated into many different applications both on the consumer side, as well as the industrial side."
Alex Kipman, technical fellow for AI Perception and Mixed Reality, and the public face of Microsoft's HoloLens efforts, called the forthcoming version of Kinect "a key advance in the evolution of the intelligent edge; the ability for devices to perceive the people, places and things around them."
In a LinkedIn post Monday, Kipman provided details about the depth sensor that will be part of Project Kinect for Azure and that also will be included in the next generation of HoloLens. Features include 1024x1024 pixel resolution, low power consumption, the ability to cleanly capture near and far objects, and a shutter that improves performance in sunlight.
Kipman also called Project Kinect for Azure a fourth generation of the technology because, in addition to the first-generation Xbox 360 and second-generation PC versions, Kipman said the third generation of the underlying technology helped power the first HoloLens product.
"With Project Kinect for Azure, the fourth generation of Kinect now integrates with our intelligent cloud and intelligent edge platform, extending that same innovation opportunity to our developer community," Kipman said.
Posted by Scott Bekker on 05/07/2018 at 3:00 PM
Twitter made a costly admission on Thursday afternoon. The company's stock took an after-hours hit as investors digested a company Tweet and blog post revealing that Twitter had discovered an internal bug that resulted in user passwords being stored unencrypted on an internal log.
Twitter CTO Parag Agrawal encouraged the service's 330 million users to consider changing their Twitter passwords on all services where they've used it. Agrawal said the move came "out of an abundance of caution" and emphasized that Twitter has no reason to believe the passwords ever left company systems or that they were misused.
In other words, this wasn't a breach, and you're not about to get an alert from haveibeenpwned.com.
"We mask passwords through a process called hashing using a function known as bcrypt, which replaces the actual password with a random set of numbers and letters that are stored in Twitter's system. This allows our systems to validate your account credentials without revealing your password," Agrawal wrote.
"Due to a bug, passwords were written to an internal log before completing the hashing process. We found this error ourselves, removed the passwords, and are implementing plans to prevent this bug from happening again," he said.
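The pattern Agrawal describes is worth seeing in code. The sketch below is a minimal illustration, not Twitter's implementation; bcrypt itself requires a third-party package, so it uses the standard library's scrypt, which follows the same salted, deliberately slow hashing approach. The function names are hypothetical.

```python
import hashlib
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Hash a password with a per-user random salt; only salt and digest
    should ever be written to storage or logs, never the password itself."""
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive the digest from the candidate password and compare."""
    return hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1) == digest

salt, digest = hash_password("hunter2")
assert verify("hunter2", salt, digest)
assert not verify("wrong", salt, digest)
```

Twitter's bug, as described, was that passwords were written to an internal log before the hashing step ran, which is exactly the failure this pattern is meant to prevent.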
Judging by the current public information, Twitter is handling this the right way. Changing Twitter passwords on all of our devices will be a pain -- who enjoys typing a secure password into a smartphone, after all? It's worth the annoyance.
Everybody makes configuration mistakes. Assuming that's all this is, Twitter might have been able to get away with hiding an internal flub like this, one that hasn't resulted in any known breach of passwords.
Getting the word out is respectful of the user base. It also protects Twitter in our current advanced persistent threat environment. If it turns out later that some APT was inside Twitter's systems unbeknownst to the company, the rest of us will have had a fair opportunity to secure our accounts.
So go change those passwords.
Posted by Scott Bekker on 05/03/2018 at 3:25 PM
Satya Nadella says he's never seen a database scale as quickly as Cosmos DB.
"In less than a year, Azure Cosmos DB, the first globally distributed [and] multi-model database, exceeded $100 million in annualized revenue," Microsoft's CEO told investors last week on the company's Q3 earnings call. While annualized revenue doesn't mean Microsoft has pulled down $100 million on Cosmos DB yet, it does mean that it's recently ramped up to a pace of more than $8.3 million a month.
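The monthly-pace arithmetic is straightforward to check:

```python
annualized = 100_000_000        # $100M annualized revenue run rate
monthly = annualized / 12       # pace implied per month
print(round(monthly / 1e6, 1))  # prints 8.3, i.e. about $8.3M a month
```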
"I've been around databases for a long time. I've never seen a product that's gotten to this kind of scale this quickly," Nadella said, according to the Seeking Alpha transcript of the call.
Microsoft launched Cosmos DB as an Azure service at its Build developer conference last May.
Replacing Microsoft's previous NoSQL offering, DocumentDB, and competing with Amazon Web Services (AWS) DynamoDB and Google Spanner, Cosmos DB boasts several key selling points.
As a base service of every Azure datacenter and region, Cosmos DB is globally available and allows developers to click on a map to add or delete geographic regions even while their Cosmos DB-based applications are running. That worldwide distribution allows for both massive scale and minimal latency for users anywhere.
Cosmos DB includes support for multiple data models, including document, graph, key-value, table and column-family, and has APIs for SQL, MongoDB, Cassandra, Gremlin and Table.
Microsoft also stands behind Cosmos DB with guarantees, including a four-nines uptime service-level agreement, end-to-end latency times in the low millisecond range and zero-data loss in the case of regional failures.
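A four-nines SLA has a concrete downtime budget; a quick back-of-the-envelope calculation shows what 99.99 percent uptime allows:

```python
availability = 0.9999             # "four nines"
minutes_per_year = 365 * 24 * 60  # 525,600 minutes in a non-leap year
downtime = (1 - availability) * minutes_per_year
print(round(downtime, 1))         # prints 52.6 -- minutes of allowed downtime per year
```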
During the investor call, Nadella positioned Cosmos DB as a potential cornerstone of future data projects on Azure, especially as artificial intelligence efforts drive demand for more and more data. "This AI era is mostly first a data era. And that's where I think the opportunity lies," Nadella said. "Cosmos DB happens to be one of the best database products to be able to capture the signals that you want around your customers from a variety of different sources."
How many customers would be required to drive more than $100 million in annualized revenue is not clear. From the earliest discussions at Build last year, it was evident that even a few customers can drive massive data volumes. At that show, Microsoft Executive Vice President of Cloud and Enterprise Scott Guthrie said early adopters, including Jet.com, were already using Cosmos DB to the tune of 100 trillion transactions per day.
Andrew Brust, founder and CEO of Blue Badge Insights and a co-chair of the 1105 Media Visual Studio Live! conference series, is seeing broad interest in Cosmos DB in the database community.
"My own observation of Microsoft tech influencers who have years -- or decades -- of experience with conventional database technology like SQL Server (and therefore Azure SQL Database), is that they are kicking the tires on Cosmos DB and still working to understand how its pricing will work for them in production," Brust said in an e-mail interview.
"The use case for massive, global-scale Web properties employing a database like Cosmos DB is pretty clear. For the long tail of enterprise applications with smaller, less concurrent and less geographically dispersed user bases, the adoption of Cosmos DB is more disruptive and less straightforward," Brust said. "This will likely get better with time, and the pricing models for Cosmos DB will likely become more compelling and more easily understood, too. Amazon's DynamoDB has been out for many years and is explicitly integrated with a number of other AWS services so, clearly, Cosmos DB isn't on a level playing field with it yet. This, too, will change over time."
Brust says a key point to remember about Cosmos DB is that it's based on the same technology Microsoft has used for its own cloud services for years.
"The company's skin is in the proverbial game, with very high stakes. If it's worked for them, it's going to work really well for their customers. What's left is more fit and finish on pricing, marketing and rationalizing the service with other components in the Azure data and analytics stack," Brust said.
Posted by Scott Bekker on 04/30/2018 at 9:55 AM
Microsoft reported earnings on Thursday of $0.95 per share on revenues of $26.82 billion.
The third-quarter (January through March) figures beat analyst expectations of $0.85 per share and revenues of $25.77 billion, but the company's stock still fell in after-hours trading following the news.
Revenue was up 16% and earnings per share (EPS) climbed 36%, although the EPS figure excluded certain items.
The results included surprisingly strong Windows and Surface results in a largely flat-to-declining PC market, but seemed mainly powered by continuing cloud strength.
By broad business unit, Microsoft did $9 billion in revenues in Productivity and Business Processes, $7.9 billion in Intelligent Cloud, and $9.9 billion in More Personal Computing.
The 13% revenue gain in More Personal Computing included a 21% jump in Windows commercial products and cloud services revenue, and a 32% jump in Surface revenue. Microsoft attributed the Windows gain to an increased volume of multi-year agreements and to internal accounting reasons, with the mix of products sold carrying higher in-quarter revenue recognition than in the year-ago period.
As for Surface, Microsoft said the company's line of Microsoft-branded PCs had a favorable comparison against a year-ago period impacted by end-of-lifecycle dynamics.
Amy Hood, executive vice president and chief financial officer at Microsoft, said in the earnings news release that the company's performance across all segments was better than expected. "We delivered double-digit revenue and operating income growth driven by 58% growth in our commercial cloud revenue."
Microsoft CEO Satya Nadella chose to interpret the results as evidence of customer trust in the Microsoft cloud. "We are innovating across key growth categories of infrastructure, AI, productivity and business applications to deliver differentiated value to customers," Nadella said in a statement.
Revenues for the Productivity and Business Processes unit and the Intelligent Cloud unit, both of which include cloud products, were each up 17%. Those gains included growth of 42% in Office 365 commercial, 65% in Dynamics 365 and 93% in Azure.
Posted by Scott Bekker on 04/26/2018 at 2:21 PM
Microsoft outlined a major new security vision this week called Azure Sphere that aims to secure the billions of devices on the Internet of Things (IoT) from device hardware to software to cloud, and gives Microsoft a central role.
Brad Smith, Microsoft president and chief legal officer, announced the initiative on Monday during a security news briefing in San Francisco timed to coincide with the start of the 2018 RSA Conference.
"What we're announcing today is Azure Sphere. It is an end-to-end IoT solution. It goes where...no company has gone before," Smith said.
The Azure Sphere solution has three parts: Azure Sphere MCUs, the Azure Sphere OS and the Azure Sphere Security Service.
Azure Sphere MCUs: The first part is a microcontroller unit (MCU), the chips that power IoT devices. Microsoft has developed a new class of MCUs, which it also calls the Azure Sphere MCU or Azure Sphere chip. Microsoft plans to license the intellectual property of the new MCUs royalty-free for silicon partners interested in developing and manufacturing Azure Sphere chips. A major element of the chips is the Microsoft Pluton Security Subsystem for creating a hardware root of trust, storing private keys and executing cryptographic operations. Other elements of the chips include network connectivity, Microsoft I/O firewalls, an application processor, a real-time processor, flash memory, SRAM and multiplexed I/O, according to a diagram.
Azure Sphere OS: The second part is an operating system for IoT devices built on a Linux kernel, the first time Microsoft has released an OS built on Linux. According to Microsoft, the Azure Sphere OS will offer a trustworthy, defense-in-depth platform via secured application containers and a security monitor.
Azure Sphere Security Service: The cloud component is the Azure Sphere Security Service, which Microsoft describes as a turnkey cloud security service. Elements include certificate-based authentication for all communication, device authenticity checks, device status and health monitoring, automated updates of the Azure Sphere OS, and device software deployment services. The security protections through the service are designed to last for a 10-year device lifetime.
Currently, Azure Sphere is in a private preview, and Microsoft is working with select hardware providers. The first Azure Sphere chip is being developed by MediaTek Inc., which built the MT3620 as a reference architecture for Azure Sphere with Microsoft and is now sampling the chip with some customers. The company expects broad public availability for the MT3620 in the third quarter of this year.
"MediaTek has a long history of working with Microsoft on specific SoC [system on a chip] designs that meet demanding connectivity needs," said Jerry Yu, MediaTek corporate vice president and general manager of the Intelligent Devices Business Group, in a statement Tuesday. "On top of our close ties with Microsoft and design expertise, Microsoft had a vision we also believed in."
According to a blog post by Galen Hunt, partner managing director at Microsoft for Azure Sphere, a first wave of Azure Sphere devices will be "on shelves" by the end of 2018. He also promised universally available dev kits by mid-2018.
Arm Ltd. was another early partner, working closely with Microsoft to incorporate its Cortex-A application processors into Azure Sphere MCUs, according to a Microsoft page detailing the Azure Sphere silicon ecosystem. Other partners represented on that page include Hilscher, LitePoint, LongSys, Nordic, Nuvoton, NXP, Qualcomm, Seeed Studio, Silicon Labs, ST Micro, Toshiba and VeriSilicon.
During the briefing, Smith suggested why Microsoft thinks the time is right to roll out a significant IoT security initiative.
"There are going to be 9 billion of these MCU-based devices shipped this year. Think about that. For every person on the planet, there will be more than one of these MCU devices shipped. They literally will be in the toys of our children, they literally will be in our kitchens and our refrigerators, they will be in every room in our house," Smith said. "Today, fewer than 1 percent of those MCUs are connected to a network or the Internet. But that is changing, and it's going to continue to change. And what it fundamentally means is that our homes and our offices and the infrastructure of the future will literally be only as secure as the weakest link."
Smith also cited the Mirai botnet as a harbinger of the types of security threats that will become more common as IoT expands, and as a reason that a holistic security approach is needed.
"It was in 2016 that the Mirai attack basically enabled hackers to take control of 100,000 devices and use them to launch a DDoS attack by turning those devices into part of a botnet. It was an attack that, on a single day, basically took the East Coast of the United States off the Internet," he said, reinforcing an idea that he discussed earlier in his talk and in a related blog post. The idea is that Microsoft and others in the tech sector have the first responsibility to address security issues.
"We operate the platform. We unfortunately are the battlefield in many ways," he said.
Posted by Scott Bekker on 04/17/2018 at 10:07 AM
Microsoft hearts Linux and all, but the company is reaching a new level.
In a slew of security news this week, Microsoft unveiled an operating system product -- not an internal system, but an operating system product -- that it will release with a Linux kernel.
The product is Azure Sphere OS, and it's part of Microsoft's ambitious effort to place itself at the center of the emerging swarm of Internet of Things (IoT) devices with Azure Sphere, a combination of a reference architecture for microcontroller units (MCUs), operating systems for the devices themselves, and a cloud-based Azure Sphere Security Service to manage and secure them all. Go here for a detailed look at the broader Azure Sphere initiative, which is expected to result in shipping products by the end of the year.
In announcing Azure Sphere during a security news briefing on Monday, Microsoft President and Chief Legal Officer Brad Smith took a moment to acknowledge the significance of the Linux component.
"For anybody who has been following Microsoft, I'm sure you'll recognize that after 43 years, this is the first day that we're announcing that we'll be distributing a custom Linux kernel," Smith said. "It's an important step for us, it's an important step I think for the industry, and it will enable us to stand behind the technology the way I believe the world needs, because what we will do is ensure that these devices are secured throughout their 10-year lifetime with the continuing improvements and updating to the Azure Sphere operating system."
Microsoft describes the Azure Sphere OS as a trustworthy, defense-in-depth operating system built in five layers: OS Layer 0 interacts with the hardware, OS Layer 1 runs a security monitor, OS Layer 2 hosts the custom Linux kernel, OS Layer 3 covers on-chip connectivity services, and OS Layer 4 provides app containers for compute and real-time I/O.
This is not Microsoft conceding defeat to Linux, with which Windows has fought for decades, but more of a tactical cooperation with the open source community that Microsoft has increasingly worked with over the last several years.
Microsoft is still sprinkling the operating system with Windows features, while recognizing that a Linux kernel is more efficient for the resource-constrained devices of IoT.
"This is a new operating system. It's based on a custom Linux kernel -- a custom Linux kernel that has really been optimized for an IoT environment and is reworked with security innovations pioneered in Windows," Smith said. "Of course, we are a Windows company, but what we've recognized is the best solution for a computer of this size in a toy is not a full-blown version of Windows. It is what we are creating here. It is a custom Linux kernel, complemented by the kinds of advances that we have created in Windows itself."
Even with those caveats, this is a significant step for Microsoft. This is a company that always saw Windows as the answer to any operating system question -- from Windows Datacenter Server in the largest use case to the recent Windows IoT Core for the very smallest.
The new days at Microsoft just keep on coming.
Posted by Scott Bekker on 04/17/2018 at 10:45 AM
Gartner and IDC this week both released their reports on the worldwide PC market. It's a tale of two markets -- with neither of the stories being very happy.
Let's start with the better news. That would be from IDC, which found evidence of a flat market. That's right, this was the good news. IDC reported that worldwide there were some 60.4 million PCs sold in the January-to-March period.
That amounted to 0.0 percent growth over the year-ago quarter. The reason that's good news is that IDC had previously forecast a drop of 1.5 percent, so flat is better than declining.
IDC also found some green shoots related to Windows 10. In its discussion of the quarter, IDC noted that businesses are moving to Windows 10 at a steady clip.
Speaking of the U.S. market, Neha Mahajan, senior research analyst for Devices & Displays at IDC, stated, "The year kicked off with optimism returning to the U.S. PC market, especially on the notebook side. A likely rise in commercial activity amidst a positive economic environment is expected to further strengthen demand."
Overall, Jay Chou, research manager of IDC's Personal Computing Device Tracker, called the path that PCs are on "resilient" and predicted "modest commercial momentum through 2020."
Even that modest optimism was not evident in an assessment released on the same day by Gartner. Gartner, while pegging the market slightly larger at 61.7 million unit shipments for the quarter, reported a 1.4 percent decline in PC shipments for Q1.
Gartner Principal Analyst Mikako Kitagawa affixed the blame primarily to the Chinese market. "The major contributor to the decline came from China, where unit shipments declined 5.7 percent year over year," Kitagawa said in a statement. "This was driven by China's business market, where some state-owned and large enterprises postponed new purchases or upgrades, awaiting new policies and officials' reassignments after the session of the National People's Congress in early March."
Where IDC saw some modest improvements in the U.S. market, Gartner found red ink there, too, reporting a 2.9 percent decline in U.S. PC shipments from Q1 2017 to Q1 2018. In all, Gartner declared Q1 2018 the 14th consecutive quarter of declining PC shipments, a streak stretching back to late 2014.
Posted by Scott Bekker on 04/13/2018 at 1:10 PM
Adam Kujawa, director of Malwarebytes Labs, has a strong reaction to the amount of cryptomining malware his company saw in the first quarter of 2018.
"Cryptomining has just gone insane," Kujawa said in an interview about Malwarebytes Labs' new security report covering the January-to-March period. "It's all over the place. We've never seen a mass migration to the use of one particular type of threat so fast by so much of the cybercrime community as we have seen with cryptominers."
Malwarebytes on Monday released "Cybercrime tactics and techniques: Q1 2018," the latest in its quarterly series of reports based on telemetry from its business and consumer products.
There are legitimate miners that get a user's consent before repurposing all or most of their CPU capacity toward mining for cryptocurrencies. Malwarebytes' report focuses on the other kinds -- malware-based miners that are often delivered via existing malware families and browser-based miners that hijack a victim's processor through drive-by attacks or malicious browser extensions.
The company found that cryptomining detections were way up in the quarter for consumers, with Android miners in particular surging to 40 times more detections this quarter than last. There was also a boom in March in Mac-based detections of malware-based miners, browser extensions and cryptomining apps, the company found.
For now, it's mainly a consumer problem. Business customers saw a 27 percent increase in cryptomining -- a significant jump to be sure, but nowhere near the levels on the consumer side.
This security report is a trailing indicator given that it covers the first three months of the year. Yet the cryptomining spike documented by Malwarebytes is tracking a little behind the price movement on the flagship cryptocurrency, Bitcoin, which had a recent peak in December but has been mostly falling from those highs over the last quarter.
Damages from cryptomining are squishy for businesses to calculate. A drive-by, browser-based attack, for example, can sometimes be stopped by simply shutting down the offending tab. Other types of cryptomining malware can be much more insidious.
How much damage is really done? There's lost productivity for sure, but Kujawa argues the malware delivery vectors that brought the cryptomining malware to systems will represent a lasting problem, even if cryptocurrency values don't rebound quickly and attackers lose interest in the attacks.
"A miner may only cause minimal damage, but any infection that you don't want to be on your system can install different stuff," he said. "The attacker sends a message to the miner: 'Hey install some ransomware for me, worm, go back to the old tricks.' It's like keeping your back door unlocked."
Posted by Scott Bekker on 04/09/2018 at 1:42 PM
Even though Microsoft has lobbied consistently for the CLOUD Act, the speed with which the federal legislation went from bill to presidential signature took even the technology giant by surprise.
In a lengthy blog post Tuesday, Microsoft President and Chief Legal Officer Brad Smith admitted that passage of the CLOUD Act, which stands for Clarifying Lawful Overseas Use of Data Act, on March 23 was a "bit of a shock."
Congress slipped the CLOUD Act into a 2,000-plus-page omnibus bill that President Donald Trump signed after a brief show of protest. Trump had tweeted that he might veto the bill, although his objections to the $1.3 trillion bill that narrowly averted a government shutdown involved other aspects of the legislation, such as the level of border-wall spending and a lack of action on DACA. The CLOUD Act portion did not come up in most news coverage of the omnibus bill.
Microsoft's advocacy of the CLOUD Act even extended to urging the Supreme Court during oral arguments in the Microsoft warrant case in late February to wait for Congress to pass it, or similar legislation.
"This Court's job is to defer, to defer to Congress to take the path that is least likely to create international tensions. And if you try to tinker with this, without the tools that -- that only Congress has, you are as likely to break the cloud as you are to fix it," said lawyer E. Joshua Rosenkranz in Microsoft's closing statement in the case, which involved U.S. law enforcement efforts to obtain customer data stored by Microsoft in a datacenter in Ireland.
For his part, Michael R. Dreeben, deputy solicitor general for the U.S. Department of Justice, was probably shocked by the timing, as well. Dreeben had argued that the high court, which is expected to rule on the case in June, shouldn't wait on a law that didn't appear at the time to have a clear path to passage. "As to the question about the CLOUD Act, as it's called, it has been introduced. It's not been marked up by any committee. It has not been voted on by any committee. And it certainly has not yet been enacted into law," Dreeben said just a month before the act passed.
While the effect of the new law on the high court's ruling is hard to predict, Microsoft's Smith blogged this week that this update to the legal code written in the context of the existence of cloud computing will help U.S. cloud providers like Microsoft balance the requirements of cooperating with legitimate law enforcement requests while protecting the privacy rights of international customers.
The road forward from the passage of the law until it starts yielding evidence for U.S.-based law enforcement efforts could be somewhat long. The act calls on the executive branch to establish reciprocal international agreements allowing law enforcement in both countries to access data in each other's countries. Yet as a first step, the administration must also establish that each country with which it creates an agreement protects privacy and human rights. Congress also has 180 days to review the agreements.
Smith's interpretation is that the law leaves room for cloud providers to challenge law enforcement requests during the interim period. "The CLOUD Act both creates the foundation for a new generation of international agreements and preserves rights of cloud service providers like Microsoft to protect privacy rights until such agreements are in place," Smith said.
Unstated in Smith's blog entry is the sigh of relief. Right now, U.S.-based technology companies dominate the global cloud computing infrastructure market. But there is no iron law that this state of affairs must continue. The Edward Snowden revelations of 2013 marked a huge challenge to international businesses' and governments' trust in U.S.-based companies' ability and willingness to protect their data from the U.S. government. Microsoft, Google and Amazon have been looking over their shoulders for potential new international competitors and contemplating a potentially fragmented global market where U.S.-based cloud providers could be shut out of some countries over data sovereignty and citizen privacy concerns.
Smith laid out that line of thinking in a February post about the Supreme Court case. "U.S. companies are leaders in cloud computing. This leadership is based on trust. If customers around the world believe that the U.S. Government has the power to unilaterally reach in to datacenters operated by American companies, without reference or notification to their own government, they won't trust this technology," Smith wrote.
The passage of the CLOUD Act gives Microsoft a much stronger privacy story for international customers and an opportunity, along with other U.S.-based cloud providers, to continue leading the global charge for cloud computing.
Posted by Scott Bekker on 04/04/2018 at 3:48 PM
A new training module in Microsoft's Professional Program gives tens of thousands of people a chance to brush up their AI skills.
Announced Monday, the Microsoft Professional Program for Artificial Intelligence will consist of 10 parts, each of which is supposed to take eight to 16 hours to complete. Attendees can either audit the courses or pay in order to get a certificate of completion.
Microsoft framed the program as a massive online open course (MOOC) that grew out of Microsoft's internal AI training initiatives, including one project-based, semester-style program called AI School 611.
"The program provides job-ready skills and real-world experience to engineers and others who are looking to improve their skills in AI and data science through a series of online courses that feature hands-on labs and expert instructors," Microsoft noted in the description of the new Microsoft Worldwide Learning Group program.
The nine courses include an intro to AI, using Python to work with data, using math and statistics techniques, considering ethics for AI, planning and conducting a data study, building machine learning tools, building reinforcement learning models, and developing applied AI solutions. The applied AI section has three options -- natural-language processing, speech-recognition systems, or computer vision and image analysis.
The track ends with a final project called the Microsoft Professional Capstone: Artificial Intelligence. Details of the capstone project are coming soon, according to Microsoft's Web site explaining the program.
Microsoft first unveiled the idea of broad-based courses in 2016 under the name Microsoft Professional Degree, and later renamed the idea as the Microsoft Professional Program.
The first track under the program was Data Science. Microsoft currently also offers Big Data, Front-End Web Development, Cloud Administration, DevOps, IT Support and Entry Level Software Development.
Posted by Scott Bekker on 04/02/2018 at 2:35 PM
Officials from the U.S. Federal Bureau of Investigation and Department of Homeland Security are warning network managers to be on the lookout for password-spray attacks.
Password spraying occurs when an attacker tests a single password against multiple user accounts at an organization. The method often involves weak passwords, such as Winter2018 or Password123!, and can be an effective hacking technique against organizations that are using single sign-on (SSO) and federated authentication protocols but that haven't deployed multifactor authentication.
By hitting multiple accounts, the method can test a lot of user names without triggering account-lockout protections that kick in when a single user account gets hit with multiple password attempts in a row.
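That same contrast suggests a detection angle: instead of counting failures per account (which spraying deliberately keeps low), count distinct failed accounts per source. The sketch below is my own illustration of that heuristic, not anything from the US-CERT alert; the event schema and the threshold of five accounts are assumptions for demonstration purposes.

```python
from collections import defaultdict

# Hypothetical sketch: spot a password-spray pattern in authentication logs.
# A spray produces failed logins spread across many DISTINCT accounts from one
# source, while classic brute force hammers a single account repeatedly --
# which is what per-account lockout thresholds are built to catch.
# The event fields (user, source_ip, success) are illustrative, not a real
# log schema.

def flag_spray_sources(events, min_distinct_users=5):
    """Return source IPs whose login failures span many distinct accounts."""
    failed_users = defaultdict(set)
    for e in events:
        if not e["success"]:
            failed_users[e["source_ip"]].add(e["user"])
    return {ip for ip, users in failed_users.items()
            if len(users) >= min_distinct_users}

events = (
    # One source tries a single weak password once against six accounts...
    [{"user": f"user{i}", "source_ip": "203.0.113.9", "success": False}
     for i in range(6)]
    # ...while a legitimate user merely mistypes her own password twice.
    + [{"user": "alice", "source_ip": "198.51.100.2", "success": False}] * 2
)

print(flag_spray_sources(events))  # -> {'203.0.113.9'}
```

The per-account lockout counter never fires in the spray case (each account sees only one failure), which is exactly why the distinct-accounts-per-source view is the more useful one here.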
"According to information derived from FBI investigations, malicious cyber actors are increasingly using a style of brute force attack known as password spraying against organizations in the United States and abroad," the agencies declared in a US-CERT technical alert issued Tuesday evening.
Prompting the alert was the disclosure last Friday of a federal indictment against nine Iranian nationals associated with the Mabna Institute, a private Iran-based company accused of hacking on behalf of the Iranian state. The main focus of that indictment was a massive, four-year spear-phishing campaign to steal credentials from thousands of university professors whose publications could allegedly advance Iranian research interests.
Also caught up in the alleged Iranian effort were 36 private companies in the United States, 11 companies in Europe, and multiple U.S. government agencies and non-governmental organizations. The method of attack against those organizations was password spraying.
According to the indictment:
In order to compromise accounts of private sector victims, members of the conspiracy used a technique known as 'password spraying,' whereby they first collected lists of names and email accounts associated with the intended victim company through open source Internet searches. Then, they attempted to gain access to those accounts with commonly-used passwords, such as frequently used default passwords, in order to attempt to obtain unauthorized access to as many accounts as possible.
Once they obtained access to the victim accounts, members of the conspiracy, among other things, exfiltrated entire email mailboxes from the victims. In addition, in many cases, the defendants established automated forwarding rules for compromised accounts that would prospectively forward new outgoing and incoming email messages from the compromised accounts to email accounts controlled by the conspiracy.
The US-CERT technical alert refers to the indictment as having been handed up in February, which could explain Microsoft's detailed guidance for deterring password-spray attacks in a high-profile blog post on March 5. In that post, Alex Simons, director of program management for the Microsoft Identity Division, called password spray "a common attack which has become MUCH more frequent recently," and declared, "Password spray is a serious threat to every service on the Internet that uses passwords." The new government alert linked back to the March 5 Microsoft post as a mitigation resource.
While the Mabna-related password spraying clearly has a lot to do with the new alert, US-CERT warned that others are currently using the attack. "The techniques and activity described herein, while characteristic of Mabna actors, are not limited solely to use by this group," the alert stated.
This is US-CERT's third technical alert this year. Previous alerts warned about the Meltdown and Spectre side-channel vulnerability and Russian government cyberactivity targeting critical U.S. infrastructure.
Posted by Scott Bekker on 03/28/2018 at 8:37 AM
Security patches from January to protect Windows 7 from Meltdown opened up a different, gaping security flaw in the way the operating system protected memory, according to a security researcher who specializes in direct memory access (DMA) attacks.
Ulf Frisk revealed the vulnerability on Tuesday on his personal blog in a post called "Total Meltdown?" The patch was intended to address the Meltdown flaw in Intel, IBM POWER and ARM-based processors that emerged in January and theoretically allows a rogue process to read all memory on a system.
"[The patch] stopped Meltdown but opened up a vulnerability way worse ... It allowed any process to read the complete memory contents at gigabytes per second, oh -- it was possible to write to arbitrary memory as well," wrote Frisk, who is the author of the PCILeech memory access attack toolkit, and who described himself in a DEFCON 24 presentation in 2016 as a penetration tester specializing in online banking security and working in Stockholm, Sweden.
"No fancy exploits were needed. Windows 7 already did the hard work of mapping in the required memory into every running process. Exploitation was just a matter of read and write to already mapped in-process virtual memory. No fancy APIs or syscalls required -- just standard read and write," Frisk said.
The flaw does not affect Windows 10 or Windows 8, according to Frisk.
The problem appears to have been introduced by the Windows 7 patches released in January, during the industrywide scramble to address the Meltdown and related Spectre flaws whose existence was revealed slightly ahead of schedule. Some of the first-generation patches caused reboot and slowdown issues, among other problems.
Frisk said the subsequent March patch for Windows 7 fixed the flaw, and he discovered the problem after the March patch was released.
Posted by Scott Bekker on 03/27/2018 at 10:27 AM
Does Microsoft have a shot in the race to be the first trillion-dollar company?
Apple, Amazon and Alphabet (Google) have been front-runners in investor speculation about which company could be first to reach the psychological milestone of a trillion-dollar market capitalization.
Attention around the question peaked near the market's recent top in January and has settled considerably as stocks have fallen since. In addition, Facebook, which had been a little further back in the market cap sweepstakes, has completely worked its way out of the conversation in the midst of its recent storm of controversy over data privacy that has severely affected the stock price.
An analyst at Morgan Stanley revived the tech market cap question on Monday with a high-profile note to clients predicting Microsoft will reach a $1 trillion market cap within 12 months.
"Strong positioning for ramping public cloud adoption, large distribution channels and installed customer base, and improving margins support a path to $50 billion in EBIT and a $1 trillion market cap for MSFT," said Morgan Stanley's Keith Weiss in a note quoted by CNBC.
Shares of MSFT rose more than 5.5 percent after Morgan Stanley's note.
Here are the companies' relative market caps, according to Yahoo! Finance:
- Apple: $854 billion
- Amazon: $734 billion
- Alphabet (Google): $712 billion
- Microsoft: $710 billion
- Facebook: $453 billion
Posted by Scott Bekker on 03/26/2018 at 10:33 AM
Microsoft has been steadily incorporating more and more of its enterprise intelligence chops into its business applications, and Dynamics 365 looks poised to be one of the effort's biggest beneficiaries.
At Wednesday's Business Forward event in Amsterdam, Microsoft unveiled details and highlights of the upcoming Spring '18 release of Dynamics 365. "We're unleashing a wave of innovation across the entire product line with hundreds of new capabilities and features in three core areas: new business applications; new intelligent capabilities infused throughout; and transformational new application platform capabilities," said James Phillips, corporate vice president of the Microsoft Business Applications Group, in a blog post unveiling the changes.
One hotly anticipated component that will be generally available on April 2, when many of the capabilities of the spring release are set to begin rolling out, is the overdue Dynamics 365 for Marketing application. "This is a new marketing automation application for companies that need more than basic email marketing at the front end of a sales cycle to turn prospects into relationships," Phillips said of the component, which was originally announced in October 2016 and was supposed to ship a year ago.
Along the same lines of a more basic experience for customers with less intensive needs, Microsoft is also rolling out a new module called Dynamics 365 for Sales Professionals on April 2. Phillips described the Sales Professional version as a streamlined version of Dynamics 365 for Sales, with an emphasis in the new version on core salesforce automation capabilities. "From opportunity management to sales planning and performance management, the solution optimizes sales processes and productivity," Phillips said.
New Intelligence Capabilities
The spring release is also productizing the years of work and millions invested in artificial intelligence research, Phillips said. "These investments are infused throughout Dynamics 365 and are now available with the spring 2018 release," he said.
The highest-profile examples are in a feature set Microsoft is calling "embedded intelligence" in the Dynamics 365 for Sales application. Microsoft previously referred to the feature set as Relationship Insights. The idea is that embedded intelligence leverages information created in the sales process to recommend actions. The initial spring release on April 2 will include Relationship Assistant, Auto Capture and E-mail Engagement. Relationship Assistant analyzes customer interactions in Dynamics 365, Exchange and other sources to generate action cards that suggest next steps. Auto Capture takes a salesperson's Outlook messages and appointments that relate to Dynamics 365 deals and offers to track them. E-mail Engagement tracks whether recipients open messages and attachments, click through links or reply to messages, and allows scheduling e-mails and reminders.
Common Data Service for Analytics and Apps
The launch will also include previews for a new set of data integration services built on the common data model -- one for Power BI and one for PowerApps.
The Common Data Service (CDS) represents another Microsoft run at the age-old problem of integrating data from multiple sources and trying to wrangle actionable business intelligence out of the combined data. "The CDS for Analytics capability will reduce the complexity of driving business analytics across data from business apps and other sources," said Arun Ulag, Microsoft general manager of Intelligence Platform Engineering, in a blog post. Common Data Service for Analytics works with Power BI.
Ulag said CDS for Analytics expands Power BI with the introduction of an extensible business application schema. "Pre-built connectors for common data sources, including Dynamics 365, Salesforce and others from Power BI's extensive catalog, will be available to help organizations access data from Microsoft and third parties. And organizations will be able to add their own data," he said.
One of those pre-built Power BI apps, designed for Dynamics 365 for Sales, is supposed to enter the public preview stage during the second quarter of this year. Called Power BI for Sales Insights, the app will provide relationship analytics. The purpose is to help salespeople manage pipeline by using AI to rate the health of customer relationships with techniques including sentiment analysis. Another CDS for Analytics-based Power BI app coming to public preview in the second quarter is called Power BI for Service Insights.
On the PowerApps side, Microsoft is unveiling a preview of Common Data Service for Apps on April 2. When it ships, it will come with PowerApps and offer capabilities for modeling business solutions within platforms like Dynamics 365 and Office 365.
Others of the hundreds of new features in the spring release aim to unify Microsoft's business applications and improve integrations with Microsoft technologies, including Outlook, Teams, SharePoint, Stream, Flow, Azure, LinkedIn, Office 365 and Bing. Microsoft will be providing more detail on March 28 in a Business Applications Virtual Spring Launch Event.
Posted by Scott Bekker on 03/21/2018 at 1:51 PM
Intel's massive effort to protect all of the chips it has released in the past five years against Spectre and Meltdown is now finished.
The company announced its completion of the microcode updates on Thursday, adding that it has also redesigned the processors being released later this year to offer additional protections.
"We have now released microcode updates for 100 percent of Intel products launched in the past five years that require protection against the side-channel method vulnerabilities discovered by Google," said Intel CEO Brian Krzanich in a statement.
The declaration would bring to a close a promise Krzanich made in a keynote at CES in the second week of January just after news broke that Intel and its OEM and software partners were working feverishly to fix the flaws, which represented a serious theoretical threat but did not seem to have been exploited in the wild.
At the time, Krzanich said Intel expected to issue fixes for 90 percent of its processors within a week and fixes for all of them by the end of January. However, complications arose involving bricked systems, server performance issues and reboot problems.
While Intel is done working on the microcode, that doesn't necessarily mean all systems can be patched yet. Because customers get the fixes through their OEMs rather than from Intel, it could still take time for some of Intel's OEMs to test and approve the patches on their supported systems.
At the same time, Intel redesigned forthcoming processors shipping later this year to address two of the three variants of the Spectre/Meltdown family identified by Google Project Zero's reporting.
"While Variant 1 will continue to be addressed via software mitigations, we are making changes to our hardware design to further address the other two. We have redesigned parts of the processor to introduce new levels of protection through partitioning that will protect against both Variants 2 and 3," Krzanich said Thursday. "These changes will begin with our next-generation Intel Xeon Scalable processors (code-named Cascade Lake) as well as 8th Generation Intel Core processors expected to ship in the second half of 2018."
Posted by Scott Bekker on 03/15/2018 at 2:29 PM
Security researchers found a logical flaw, described by one non-affiliated security expert as "fascinating," in the Credential Security Support Provider (CredSSP) protocol used by Remote Desktop and WinRM and affecting all supported versions of Windows.
Preempt Security reported the flaw to Microsoft last August and Microsoft released a fix this week as part of the March Patch Tuesday release. The flaw, CVE-2018-0886, was rated "important" by Microsoft, which is a middling severity designation in Microsoft's scale, largely because the new flaw is not an initial infection vector.
Instead, an attacker needs to already be inside the network and set up a man-in-the-middle (MITM) attack via methods that could include ARP Poisoning or even the new WPA2 vulnerability known as KRACK.
CredSSP is designed to securely forward a user's full credentials to a target server. The flaw relies in part on the fact that the client trusts the public key provided by the server. In the case of an RDP connection, an attacker would intercept the initial connection request from the client and return a malicious command to the client, which assumes the command is actually a valid public key from the server and signs it. That signed version is passed by the MITM back to the server, which executes the malicious code -- now signed by the client -- on the server.
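The logic error can be caricatured in a few lines: the client signs whatever blob the man in the middle hands it, so the server later sees a valid client signature over attacker-chosen data. This is an abstract sketch of that trust flaw only, not the actual CredSSP wire protocol; the HMAC stands in for the real signing machinery.

```python
import hashlib
import hmac

CLIENT_KEY = b"client-secret"  # stands in for the client's signing credential

def client_sign(blob: bytes) -> bytes:
    # The flaw in caricature: the client signs the "server public key"
    # blob with no way to verify it actually came from the server.
    return hmac.new(CLIENT_KEY, blob, hashlib.sha256).digest()

def server_verify(blob: bytes, sig: bytes) -> bool:
    expected = hmac.new(CLIENT_KEY, blob, hashlib.sha256).digest()
    return hmac.compare_digest(expected, sig)

legit_pubkey = b"server-public-key"
malicious_cmd = b"run-attacker-command"

# The man in the middle swaps the server's key for a command before signing...
sig = client_sign(malicious_cmd)

# ...and the check passes, because the client's signature is genuine --
# it just covers data the attacker chose.
assert server_verify(malicious_cmd, sig)
print("server accepted attacker-chosen blob with a valid client signature")
```

The takeaway matches Preempt's description: nothing in the exchange binds the signed blob to a key the client independently trusts, so a genuine signature proves only that the client signed something, not that it signed the server's key.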
Preempt positions the flaw as a technique for lateral movement and privilege escalation. One of the most severe scenarios would be if the attacker intercepts an attempt by an administrator to remotely log on to a domain controller.
"This vulnerability is a big deal, and while no attacks have been detected in the wild, there are a few real-world situations where attacks can occur," said Roman Blachman, Preempt CTO and co-founder, in a statement. Preempt also posted a video showing how the attack works and a technical blog post. "Ensuring that your workstations are patched is the logical, first step to preventing this threat. It's important for organizations to use real-time threat response solutions to mitigate these types of threats," Blachman said.
Dustin Childs of the Zero Day Initiative at Trend Micro highlighted the patch in his analysis of Microsoft's Patch Tuesday release, which included 14 updates resolving 78 unique vulnerabilities. "This patch corrects a truly fascinating bug," Childs wrote of the CredSSP flaw. "It's important to understand this is not a constrained delegation. CredSSP passes the user's full credentials to the server without any constraint. That's a key to how an attacker would exploit the bug."
Childs also warned that applying the patch isn't enough to be fully protected. "Sysadmins must also enable Group Policy settings on their systems and update their Remote Desktop clients. While these settings are disabled by default, Microsoft does provide instructions to enable them. Of course, another alternative is to completely disable RDP, but since many enterprises rely on this service, that may not be a practical solution," he wrote.
Microsoft also released a support document that describes the steps required to update Group Policy or Registry settings to protect against the flaw. In a related step, Microsoft plans to update the Remote Desktop Client next month to provide more detail in error messages when an updated client fails to connect to a server that has not been updated.
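Based on Microsoft's published guidance for CVE-2018-0886, the Group Policy setting ("Encryption Oracle Remediation," under Credentials Delegation) is backed by a registry value along these lines; verify the exact path and values against Microsoft's support document before deploying:

```
Windows Registry Editor Version 5.00

; Encryption Oracle Remediation policy backing value.
; 0 = Force Updated Clients, 1 = Mitigated, 2 = Vulnerable.
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System\CredSSP\Parameters]
"AllowEncryptionOracle"=dword:00000000
```

Note that "Force Updated Clients" will block connections to unpatched servers, which is why Childs warns that clients and Group Policy must be updated together.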
A team from Preempt will give a presentation on the vulnerability at Black Hat Asia 2018 next week.
Posted by Scott Bekker on 03/14/2018 at 10:10 AM
Microsoft on Monday marked the one-year anniversary of rolling out Microsoft Teams by introducing a raft of new features coming to the teamwork hub through this calendar year.
Microsoft launched Teams on March 14, 2017, as an answer to Slack, and more recently has disclosed that Teams will merge over time with Skype for Business. By launching Teams as a component of Office 365, Microsoft quickly exposed the new platform to the cloud productivity suite's broad base of 120 million users. Microsoft did not provide an update on Monday for how many users Teams has, but the company did report that 200,000 organizations are now using Teams.
After a rough start to 2018 within Microsoft and across the industry, Microsoft's digital voice assistant Cortana will get some attention from Teams engineers at Microsoft. Microsoft plans to add voice integrations within Teams that will allow users to speak with natural language to make a call, join a meeting or add other people to a meeting. The functionality is planned at first for IP phones and conference room devices.
In addition to Cortana integration, other features coming this year include background blur on video, inline message translation, proximity detection for Teams Meetings and mobile sharing in meetings.
The background blur will be an appealing feature for anyone calling into a meeting when they've got an unprofessional scene behind them or a background that they'd otherwise like to keep meeting participants from seeing. Blurring is one approach to the issue. Another approach, from Zoom Video Communications, is a Virtual Background for videoconferencing that allows users to select and display an image, such as a cityscape, behind them during a meeting.
Inline message translation presumably will leverage translation and transcription services in Azure to make posts readable to participants who speak different languages in chats and in channels, which is the Teams term for topic-based discussions among members of a team. With users in 181 Microsoft-defined markets around the world, the translation feature could get heavy use.
The proximity detection feature is designed to help users find and add a Skype Room System. A more universally useful feature will be mobile sharing, which will let attendees share live video streams, photos or their mobile screen.
Microsoft also disclosed a new enterprise calling feature to be available by the end of June called Direct Routing. While the specifics are complicated and have a lot of dependencies on both Microsoft products and third-party infrastructure, Direct Routing will be a way for customers to use existing telephony infrastructure with Teams for calling. In that sense, Direct Routing joins Microsoft Calling Plans as ways for customers to enable calling from Teams.
On the anniversary, Microsoft also highlighted some previously disclosed elements of the Teams roadmap. One is cloud recording, a one-click meeting recording option that will automatically transcribe and timecode a meeting. Features include the ability to read captions, search the conversation and play back the meeting. Later, Microsoft plans to add facial recognition to automate attribution of comments to specific attendees. Parts of the calling roadmap that Microsoft highlighted again on Monday included consultative transfer and call delegation.
Although they weren't reinforced on Monday, Microsoft has previously discussed a number of features coming by the end of June. For meetings, those features include broadcast meetings, federated meetings, large meeting support for about 250 participants, a lobby for PSTN callers, Outlook meeting schedules from other platforms, PowerPoint loading and sharing, whiteboard and meeting notes, user-level meeting policies for IT professionals, and e-discovery enhancements.
On the calling side, Microsoft has publicly talked about 2018 availability for call support between Teams and Skype Consumer, distinctive rings, call queues, "do not disturb" breakthrough, forwarding to group, call parking and group call pickup. (For more background on Teams-Skype integration, listen to the Redmond Tech Advisor webcast with Office 365 and SharePoint MVP Christian Buckley from December.)
Posted by Scott Bekker on 03/12/2018 at 12:35 PM
Kali Linux, the distribution dedicated to penetration testing and a favorite of hackers wearing white, gray and black hats all around the world, just hit the Microsoft Store.
What that means is that Windows 10 users can now quickly download and install the distribution for free and be running the powerful security testing platform in a matter of minutes.
Tara Raj, a program manager at Microsoft who works with the Windows Subsystem for Linux (WSL), announced availability of Kali Linux in the Microsoft Store in a blog post on Monday. "We are happy to officially introduce Kali Linux on WSL," Raj wrote. She noted "great interest" in Kali among the WSL community after Offensive Security, the security and training company that maintains Kali Linux, posted a tutorial in January for getting the OS running in WSL.
The app-ified experience within the Microsoft Store simplifies and speeds up the installation process, but, somewhat paradoxically, Kali within the WSL is a far less intuitive experience for a Windows user than running the pentesting distribution on a dedicated system, on a Live USB stick, or in a virtual machine.
Downloading Kali from the Microsoft Store is quick. Users who haven't tried the Linux subsystem need to enable WSL first, a simple process involving running PowerShell as an administrator, pasting in one line of code and restarting the system. (Offensive Security's video setup walkthrough also covers enabling WSL.)
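That one-line enablement step looks like this (the standard WSL feature-enablement command as of early 2018; confirm against Microsoft's current WSL documentation):

```powershell
# Run from an elevated PowerShell prompt, then restart when prompted.
Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Windows-Subsystem-Linux
```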
Next, navigate to the Microsoft Store, search for Kali Linux and press the "Get" button. After a short 134MB download, a prompt appears to "Launch" Kali or to "Pin to Start".
Once Kali is launched for the first time, the Microsoft Store process takes care of several steps on the user's behalf. Compared to Offensive Security's January tutorial video for running Kali on WSL, downloading Kali Linux from the Microsoft Store seems like it cuts out about half of the previously required commands.
In as little as a few seconds, a command window opens, the installation finishes, and the user gets a prompt to create a regular user account and enter a password.
This is the spot where Kali Linux on WSL is less intuitive for a Windows native than actually running Kali in a full-on Linux environment would be, for several reasons.
First, once Kali Linux is installed on Windows, you're looking at a blinking command-line cursor. This is an unforgiving command-line environment where you need to have a rock-solid understanding of Linux commands and Linux file structures in order to do anything.
By comparison, Kali in its native Linux environment actually boots into an attractive GUI. Power users may want to operate primarily in the terminal, but beginners can point and click, navigate files and folders graphically, and explore the interface.
The next way the WSL version is limiting for new users is spelled out in the Microsoft Store description: "This image contains a bare-bones Kali Linux installation with no penetration testing tools -- you will need to install them yourself." Users must know what penetration testing tools to look for, where to find them, and how to download and install them.
The default Kali Linux installation, on the other hand, is an inviting interface that encourages exploration. Dozens of attack tools are preloaded and organized logically by function. A user can drag down the Applications menu in the upper-left and browse tools for Information Gathering, Vulnerability Analysis, Password Attacks, Wireless Attacks, Exploitation Tools, Social Engineering Tools and others.
One other caveat in the WSL version mentioned in the Microsoft Store description: "Some tools may trigger antivirus warnings when installed, please plan ahead accordingly." For example, the endpoint protection software on my system was not a fan of several files that Kali WSL tried to download while installing Metasploit, flagging them with detections such as Trojan.Gen.2, OSX.Trojan.Gen, Meterpreter and Hacktool, among others. They all got quarantined and, I suspect, prevented Metasploit from launching properly.
For users with intermediate-level Linux skills and strong familiarity with the capabilities of various penetration testing tools in Kali Linux and how to load them, this app is a great addition to the Microsoft Store. It simplifies installation and brings Kali Linux squarely into the everyday Windows desktop. If you know what you're doing and what you want to do, it can be handy to have that Kali terminal running right inside your Windows environment for easy access.
For those who haven't used Kali much or at all and are interested in learning what its frightening and impressive capabilities might reveal about the security of their corporate environments, the WSL version is less useful. In that case, it's still worth the trouble of jumping through the installation hoops to get a regular Kali environment running on a dedicated physical machine or virtual machine.
Posted by Scott Bekker on 03/07/2018 at 9:26 AM
The server business was booming in the fourth quarter of 2017, according to market research from IDC. PCs and smartphones, not so much.
IDC released a slew of reports this week recapping the most recently completed quarter, now that most of the publicly traded vendor companies have released their quarterly financial reports, with all those reports' attendant clues.
Server market revenue jumped 26 percent year over year to $20.7 billion in the fourth quarter. IDC attributed the momentum to several factors, including traction for Intel's Purley-based offerings and AMD's EPYC-based offerings. Unit volumes grew as well, with server shipments increasing nearly 11 percent to 2.84 million units for the quarter.
Yet the factor propping up the server market overall remains the shift of computing from distributed client sites to centralized megavendor datacenters.
"Hyperscalers remained a central driver of volume demand in the fourth quarter with leaders such as Amazon, Facebook, and Google continuing their datacenter expansions and updates," said Sanjay Medvitz, senior research analyst for servers and storage at IDC, in a statement. "ODMs [original design manufacturers] continue to be the primary beneficiaries from hyperscale server demand. Some OEMs are also finding growth in this area, but the competitive dynamic of this market has also driven many OEMs such as HPE to focus on the enterprise."
By manufacturer, the HPE/New H3C Group joint venture was tied with Dell for the quarterly revenue lead, followed by IBM, Lenovo and Cisco. Taken as a group, ODM Direct vendors had a slightly bigger share of revenues than either of the leaders.
The picture for personal computing devices, which IDC defines as desktops, notebooks, slates and detachables, wasn't as positive. IDC estimates that shipments within the sector declined 2.7 percent for the full year of 2017. IDC published forecasts through 2022 and expects compound annual growth for the entire sector to be a paltry 0.1 percent over the period. Short term, IDC is looking for another drop of a little more than 3 percent in 2018, with slight pickups thereafter due to corporate refresh cycles and the ongoing popularity of detachables like the Microsoft Surface.
As for smartphones, IDC reports that 2017 marks the first year-over-year decline for the devices, which are now in a two-horse race between Android and iOS. The 1.46 billion devices that IDC estimates shipped in 2017 represented a drop of about half a percent in volume compared to 2016. Through 2022, IDC forecasts a compound annual growth rate of a little under 3 percent.
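As a quick sanity check on the compound math, using the article's figures and rounding the CAGR to 3 percent (a round-number assumption for illustration):

```python
# 1.46 billion smartphones shipped in 2017, compounding at roughly
# 3 percent a year through 2022 (five years of growth).
shipped_2017 = 1.46e9
cagr = 0.03
shipped_2022 = shipped_2017 * (1 + cagr) ** 5
print(f"{shipped_2022 / 1e9:.2f} billion")  # prints "1.69 billion"
```

So even at IDC's modest growth rate, the forecast implies volumes approaching 1.7 billion units by 2022.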
Posted by Scott Bekker on 03/02/2018 at 9:17 AM
In a lively one-hour discussion ranging from privacy rights to latency issues to robots conducting overseas seizures, U.S. Supreme Court justices sparred with lawyers from Microsoft and the U.S. government in oral arguments on Tuesday.
At issue: whether a U.S. court can order a U.S.-based e-mail service provider to comply with a probable-cause-based warrant issued under the 1986 Stored Communications Act (SCA) by disclosing e-mails that the provider has stored abroad.
State of play leading up to the Supreme Court has Microsoft ahead and playing defense. The case started with a Drug Enforcement Administration investigation in 2013. Federal agents persuaded a magistrate judge in the Southern District of New York to issue a warrant for a suspect's e-mails. Microsoft fought the order on the grounds that the e-mails were stored at its datacenter in Ireland. A U.S. District Court rejected Microsoft's appeal, but the U.S. Court of Appeals for the 2nd Circuit ruled in Microsoft's favor.
Discussion on Tuesday returned again and again to a few key topics: the many ways that the outdated SCA is woefully inadequate for the cloud era; whether the court should simply wait for pending congressional legislation to make the questions in the case moot; justices seeking clarification on what exactly happens in the United States and abroad when Microsoft or other service providers produce an e-mail record; domestic versus extraterritorial jurisdiction questions; and back-and-forth about the legal differences between warrants, subpoenas, orders, searches and disclosures.
What Microsoft wants is for the Supreme Court to leave the issue alone and to hope that Congress passes the CLOUD Act, introduced recently with bipartisan and tech industry support.
"There were conversations about where the Internet is headed," Microsoft lawyer E. Joshua Rosenkranz said Tuesday in his closing statement. "There [are] conversations about whether this will kill the tech sector, how much of an international consensus there is about the sovereignty of data. These are all questions that only Congress can answer. Meanwhile, this Court's job is to defer, to defer to Congress to take the path that is least likely to create international tensions. And if you try to tinker with this, without the tools that -- that only Congress has, you are as likely to break the cloud as you are to fix it." (Ed.'s note: All quotations in this article are taken from the 72-page official transcript posted on the Supreme Court's Web site.)
Arguing for the government, Michael R. Dreeben, deputy solicitor general for the U.S. Department of Justice, countered that the court should move before Congress to fix an unsettled legal environment.
Calling Microsoft's position "radical," Dreeben described the current situation as one where no U.S. court gets to try to balance U.S. law with other countries' relevant laws. "If the data is stored overseas, we're just out of luck. We can't even ask a court for an order that would require its production," Dreeben said.
"No other court that has issued a written opinion since Microsoft has agreed with the Second Circuit. And the Second Circuit's decision has caused grave and immediate harm to the government's ability to enforce federal criminal law," Dreeben argued.
He also urged the court not to wait for the CLOUD Act: "But as to the question about the CLOUD Act, as it's called, it has been introduced. It's not been marked up by any committee. It has not been voted on by any committee. And it certainly has not yet been enacted into law."
Predicting how justices will decide from the questions they ask in oral arguments is tricky, but there were some hints. Running through the justices in rough order from the liberal to the conservative end of the spectrum:
Justice Sonia Sotomayor asked Dreeben outright why the court shouldn't wait for Congress. "Why shouldn't we leave the status quo as it is and let Congress pass a bill in this new age?" Sotomayor also participated with several of the justices in lengthy exchanges to understand better how Microsoft would technically go about complying with an order to produce e-mails from a U.S. office that are stored in a datacenter in Ireland. At one point, Rosenkranz described the process as similar to dispatching a robot, saying, "If you sent a robot into a foreign land to seize evidence, it would certainly implicate foreign interests." Shortly after that description, Sotomayor joked, "I'm sorry...I guess my imagination is running wild."
Justice Ruth Bader Ginsburg offered similar thoughts on leaving action to Congress: "[In] 1986, no one ever heard of clouds. This kind of storage didn't exist. ... Wouldn't it be wiser just to say let's leave things as they are; if -- if Congress wants to regulate in this brave new world, it should do it?"
Justice Elena Kagan's questions were relatively technical, covering issues around whether judges could weigh other countries' laws in deciding on challenges to warrants, and discussing legislators' intent for specific provisions of the SCA.
Justice Stephen Breyer sought a short-circuit for the whole issue in trying to pin down whether Magistrate Court judges had authority to issue warrants for searches outside their geographic districts -- in this case, New York. "I suspect [that] it just can't be that easy, this case," Breyer said during a light moment in the arguments. Breyer also asked about the feasibility of a middle path involving reading the old statute to adapt to the current cloud environment.
Justice Anthony Kennedy wondered why the discussion about location wasn't broader. "Why should we have a binary choice between a focus on the location of the data and the location of the disclosure? Aren't there some other factors, where the owner of the e-mail lives or where the service provider has its headquarters?"
Justice Samuel Alito came down pretty heavily on the side of action -- the government's preferred position. "It would be good if Congress enacted legislation that modernized this, but in the interim, something has to be done," Alito said. Meanwhile, another question Alito asked established definitively that the nationality of the suspect in the case was not known, which may influence Kennedy's thinking based on his questions about locations. Alito also pressed Microsoft's Rosenkranz about what would happen in a case involving American citizens being investigated for crimes committed in the United States if their service providers store their e-mails outside the country.
Chief Justice John Roberts expressed deep reservations about service providers intentionally using the current legal standard to assist customers in avoiding U.S. investigators.
"There is nothing under your position that prevents Microsoft from storing United States communications, every one of them, either in Canada or Mexico or anywhere else, and then telling their customers: Don't worry if the government wants to get access to your communications; they won't be able to, unless they go through this MLAT [Mutual Legal Assistance Treaties] procedure, which is costly and time-consuming," he said. "Could you provide that service to your customers?"
In a give-and-take discussion, Rosenkranz assured Roberts that Microsoft's motives solely involved customer demands for minimizing latency, which he positioned as the sole reason for Microsoft's investment in half-billion-dollar datacenters all around the world. Roberts did not sound convinced: "Well, but you might gain customers if you can assure them, no matter what happens, the government won't be able to get access to their e-mails."
Justice Neil Gorsuch also seemed to stick to technical questions on subjects like the chain of activity in complying with a court order and the differences between subpoenas and warrants. At one point, Justice Breyer seemed to indicate to Dreeben that Gorsuch and others were "with you on this" but it was unclear exactly what Breyer was talking about.
Justice Clarence Thomas provided no clues as to his thinking during the oral arguments. He upheld his standard practice of asking no questions.
So the quick scorecard from this close read of the transcript is Sotomayor and Ginsburg leaning toward waiting for Congress, Alito and Roberts inclined to act, and the other five justices on the fence. Stay tuned for the decision in June.
Posted by Scott Bekker on 02/28/2018 at 7:01 AM
Microsoft attorneys will make their arguments before the U.S. Supreme Court on Tuesday in their final opportunity to sway an e-mail privacy case that is central to the willingness of international customers to trust U.S.-based cloud providers with their data, among numerous important legal issues.
The face-off between lawyers from Microsoft and the Trump administration, which is carrying on with arguments from the Obama administration, involves an e-mail privacy case stemming from a drug investigation in 2013.
At the time, federal agents sought and obtained a warrant for a suspect's e-mails from a magistrate judge in the Southern District of New York. Microsoft, reeling from international backlash to Edward Snowden document revelations that detailed cooperation by several technology giants with U.S. intelligence agencies, fought the order.
Microsoft argued that because the e-mails were stored at its datacenter in Ireland, the U.S. warrant didn't apply. The challenge failed in U.S. District Court, but succeeded in the U.S. Court of Appeals for the 2nd Circuit. The government appealed, and the Supreme Court agreed last October to hear the case.
Arguments center on a 1986 law called the Stored Communications Act that was drafted before cloud datacenters or even the widespread use of e-mail. In short, Microsoft contends that the U.S. government does not have the right to unilaterally demand e-mails held by a U.S. provider in a datacenter in another country, noting that the government has never suggested that the account holder lived in the United States or was a U.S. citizen. Instead, Microsoft argues that the U.S. government should cooperate with courts and law enforcement in other countries to obtain data held in those places.
The government's case is that a U.S. company can essentially press a button in the United States to deliver materials to government investigators, making the question of where the data resides theoretical. Government lawyers also argue that systems like Google's that involve slicing up data and storing it all over the world make questions about where the data resides even murkier.
The closely watched case has drawn more than 30 friend-of-the-court briefs from other tech firms, privacy advocates, the European Union, the U.S. Chamber of Commerce, and former law enforcement and national security officials. Additionally, 33 states had urged the high court to take the case over concerns that the appeals court ruling allowed a private company to shield evidence from law enforcement.
After the arguments Tuesday, a Supreme Court decision is expected by the end of June.
Posted by Scott Bekker on 02/26/2018 at 8:20 AM
Tying together various threads uncovered by themselves and other security companies over the last few years, security researchers at FireEye have concluded that a series of attacks represent a discrete cyber-espionage group operating on behalf of North Korea.
FireEye named the group APT37 in a report released this week, "APT37 (Reaper): The Overlooked North Korean Actor." The report connects APT37 to other attacks dating back to 2014, including the recent zero-day vulnerability CVE-2018-4878 that was disclosed on Feb. 1. Successful exploitation of that Adobe Flash Player vulnerability could allow an attacker to take control of an affected system.
FireEye's report ties that vulnerability to activities reported by other researchers, including Kaspersky Lab, which identified a group of attackers as ScarCruft, and Cisco's Talos unit, which tracks the group's activities as Group 123. The FireEye report goes further in pinpointing the group's origin as North Korea.
"We assess with high confidence that this activity is carried out on behalf of the North Korean government given malware development artifacts and targeting that aligns with North Korean state interests," FireEye wrote in the introduction to the report.
"We judge that APT37's primary mission is covert intelligence gathering in support of North Korea's strategic military, political and economic interests. This is based on consistent targeting of South Korean public and private entities and social engineering. APT37's recently expanded targeting scope also appears to have direct relevance to North Korea's strategic interests."
What's interesting about the report is that FireEye views APT37 as separate from the internationally isolated country's main suspected cyber-espionage and operations unit, which researchers call Lazarus. According to FireEye, the capabilities of APT37 are increasing, the unit's international scope of operations is expanding, and the group is likely to become another tool in North Korea's global cyber-operations arsenal.
Posted by Scott Bekker on 02/21/2018 at 8:44 AM
Peter Bauer's perch offers a commanding view of one of the greatest migrations in the history of IT -- the movement from on-premises Microsoft Exchange servers to Office 365.
Bauer is chairman and CEO of Mimecast, which provides e-mail security and data security products. Much of the company's business involves layering security and archiving onto Microsoft Office 365, and the company has been building a business on the Microsoft cloud productivity trend for years.
In the earnings call about the company's third quarter results on Monday night, a financial analyst noted that Mimecast reported that 29 percent of its customers are on Office 365 and asked how Bauer saw the Office 365 opportunity progressing in the future.
The ideal source for that information would be Microsoft, but the software and cloud giant rarely provides straight numbers from quarter to quarter, and even more rarely discusses the overall universe of Exchange seats.
In answering, Bauer said he believes Mimecast's customer base is a pretty good proxy for Office 365 adoption.
"When we talk about [Office 365] in the context of the broader Microsoft Exchange ecosystem, we estimated somewhere between 300 million and 350 million corporate e-mail users using a Microsoft-type solution for e-mail," Bauer said, according to a Seeking Alpha transcript of the call.
Bauer's figures roughly align with what Microsoft has publicly revealed about Office 365 monthly active users, which hit 120 million worldwide in October. Given Mimecast's estimate that about a third of the overall Exchange universe is on Office 365, the figure passes the back-of-the-envelope test.
Looking forward, Bauer said, "I don't know what the sort of saturation level is in terms of Microsoft's customers moving over. But if one assumes that 75 percent of the market is ultimately on [Office] 365 and if we're sort of a third of the way there after five-plus years of migration, [it] is probably at least another five maybe more years of migration that goes on in the markets."
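Bauer's back-of-the-envelope math holds up against the figures quoted here (all inputs are the article's estimates, not Microsoft data):

```python
# Mimecast's estimate of the Exchange universe and Microsoft's public
# Office 365 figure, as quoted in the article.
exchange_universe = 325e6   # midpoint of the 300M-350M estimate
office365_users = 120e6     # Microsoft's October monthly-active-user figure

share_migrated = office365_users / exchange_universe
print(f"{share_migrated:.0%}")  # prints "37%" -- "about a third"

# If 75 percent of the universe ultimately moves, at the average pace
# implied by five-plus years of migration so far:
saturation = 0.75
remaining = saturation * exchange_universe - office365_users
pace_per_year = office365_users / 5
print(round(remaining / pace_per_year, 1))  # prints 5.2 -- "at least another five years"
```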
In the face of Microsoft's vagueness, these estimates from Mimecast, a company with a lot at stake financially in getting the right answer and a sizable user base of its own to compare against, provide a valuable field report on the overall Office 365 migration.
Posted by Scott Bekker on 02/14/2018 at 8:57 AM
In the great debate over whether the robots will save us or destroy us, Microsoft CEO Satya Nadella is staking out a more activist position.
Nadella revisited the artificial intelligence issue in a speech at The Economic Club of New York on Wednesday. "I feel like sometimes we in tech, even, abdicate control: '[AI] is going to happen tomorrow and our best case is that we're going to be domesticated cats or whatever,'" Nadella said.
His comment references pessimistic statements like those from scientist Stephen Hawking, who told the BBC, "The development of full artificial intelligence could spell the end of the human race...It would take off on its own, and redesign itself at an ever-increasing rate." Or SpaceX/Tesla founder Elon Musk, who has written, "The risk of something seriously dangerous happening is in the five-year timeframe. Ten years at most."
Nadella countered Wednesday that "no, it's a choice. I'm not making fun of that as a consequence. It could happen, but only if we abdicate."
He acknowledged the reality of downsides, such as unintended consequences of automation, especially job displacement. In Nadella's view, however, the eventual behavior of AI depends on the values and actions of the people in the tech industry. "We as a society -- starting with Microsoft -- have to do some of our very best work at skilling...students in school or people who are displaced midcareer."
Those comments piggyback on a major theme of his recent book, "Hit Refresh," in which Nadella dedicated an entire chapter to the future of humans and machines.
As he wrote there, "We can't seem to get beyond this utopia/dystopia dichotomy. I would argue that the most productive debate we can have about AI isn't one that pits good vs. evil, but rather one that examines the values instilled in the people and institutions creating this technology."
At Microsoft, Nadella wrote that he is pushing the company's substantial AI-focused workforce to follow principles that AI must be "designed to assist humanity...be transparent...maximize efficiencies without destroying the dignity of people...be designed for intelligent privacy [and] have algorithmic accountability."
Posted by Scott Bekker on 02/08/2018 at 1:34 PM
Now that Windows 10 has surpassed Windows 7 for the first time in global usage statistics, it's not just another case of Microsoft's reigning OS passing the torch to the newest release. This time, it marks the ascendance of the new model of Windows.
"This is a breakthrough for Microsoft," said StatCounter CEO Aodhan Cullen in a statement this month about the handover of the lead. StatCounter, a Web analytics company, tracks operating system, browser and screen resolution information for 10 billion visitors each month to more than 2 million sites worldwide.
Cullen is talking about Windows 10's January 2018 results as a breakthrough in terms of wresting control from Windows 7, which, like Windows XP before it, has proven to be an exceptionally sticky OS version for Microsoft.
In January, Windows 10 hit 42.78 percent share worldwide, compared to 41.86 percent for Windows 7. The worldwide milestone follows similar crossover points in the United Kingdom (June 2016) and the United States (January 2017).
"Windows 10 was launched at the end of July 2015 and Microsoft will be pleased to have put its Windows 8 experience behind it. However, Windows 7 retains loyalty especially amongst business users. Microsoft will be hoping that it can replace it a lot quicker than XP, launched back in August 2001, which only fell below 5% usage worldwide in June of 2017," Cullen said.
Perhaps more significant, though, is that Windows 10's lead establishes Windows as a Service, with its rolling updates and an end to the old cycle of major launches every three years, as the mainstream model of Windows worldwide.
There was a lot of hand-wringing about what exactly was meant when Microsoft officials called Windows 10 the "last major release" of Windows. It doesn't mean much in terms of new features or security patches, which continue to flood in, and at a much more rapid pace than in the old Windows model.
What it does mean is that once users are on Windows 10, they're supported for the life of their device, theoretically protecting them, Microsoft and the Internet at large from the kinds of security issues that have emerged and spread after a still-popular OS has fallen out of support.
The model could have been a failure. It could have followed in the footsteps of Windows 8, which also demanded a lot of adjustment from customers, who, in the main, refused Microsoft's directives.
Instead, Windows 10 -- and Microsoft's new approach to version support -- is the leading type of Windows by market share.
Posted by Scott Bekker on 02/07/2018 at 12:24 PM
Microsoft's strong financial results for the second quarter of its fiscal year were once again a result of the company's relentless focus on transitioning to cloud.
The company released results after markets closed on Wednesday evening showing revenue of $28.9 billion, a 12% increase over the year-ago quarter, and operating income of $8.7 billion, a 10% bump. Microsoft posted a net loss of $6.3 billion due to a $13.8 billion charge the company took related to the tax bill that Congress passed in December.
Financial analysts greeted the results positively on an earnings call Wednesday night, and MSFT briefly hit record highs in mid-day trading Thursday before falling slightly.
CEO Satya Nadella summarized the big picture for the quarter during the analyst call. "The intelligent cloud and intelligent edge paradigm is fast becoming a reality. Azure growth accelerated. LinkedIn growth accelerated. Microsoft 365 and Dynamics 365 are driving our growth and transforming the workplace. Xbox is reaching new customers with new offers," Nadella said.
An official statement from Amy Hood, executive vice president and chief financial officer of Microsoft, clarified how key cloud was for the results for the October through December period.
"We delivered another strong quarter with commercial cloud revenue growing 56% year-over-year to $5.3 billion," Hood said. The quarterly figure shows Microsoft is now tracking above the famous $20 billion annual cloud run rate that Nadella had set as an audacious goal and that Microsoft first hit last quarter.
While that quarterly cloud figure remains a fraction of Microsoft's overall $28.9 billion in revenues, it's a growing fraction. Cloud's function as a driver of revenue growth is apparent across several of Microsoft's businesses.
The most dramatic example of the trend occurred in the Intelligent Cloud segment. In that unit, server products and cloud services revenue increased by $967 million or 18%. But the cloud portion, represented by Azure revenues, nearly doubled with 98% growth. Server products licensed on-premises also saw revenue growth, but only by 4%.
The story was similar in both Office Commercial and Dynamics. Both segments had 10% overall growth, but Office 365 commercial revenue pulled its segment forward with 41% growth, while Dynamics 365 revenue galloped ahead at 67%.
Product segments without significant cloud components to bolster them generally languished. Windows revenue was up 1% with Windows OEM revenues up and Windows Commercial revenue down. Surface revenues were also up 1% on the strength of more premium devices sold, even as overall volumes decreased.
Meanwhile, cloud has its own costs, which Hood enumerated during the call. "Excluding LinkedIn, operating expenses increased on cloud engineering and sales capacity investments," Hood said, adding later, "As expected, our capital expenditures, including finance leases, increased sequentially to $3.3 billion due to higher levels of customer demand and usage for our cloud services."
Given cloud's impact on Microsoft's quarter, it's clear why Microsoft continues to invest heavily in cloud, even after the dramatic investments of an estimated $15 billion or more in this decade to kick-start its datacenter footprint.
Posted by Scott Bekker on 02/02/2018 at 8:59 AM
Office 365 administrators who enjoy torturing their own users will have a new toy to play with this quarter. The Attack Simulator for Office 365 Threat Intelligence is expected to enter a public preview any day now, according to a recent update of Microsoft's Office 365 Roadmap.
The simulator is one of a handful of key, near-term security enhancements in the Office 365 roadmap.
The attack simulator has the potential to be a very useful proactive defense tool for IT administrators. Unveiled at Microsoft Ignite in September and set for an imminent public preview, the simulator is a new feature of Office 365 Threat Intelligence.
That Threat Intelligence service, launched last April, provides real-time security insights on global attack trends culled from what Microsoft describes as billions of data points from its global datacenters, Office clients and other sources.
According to the roadmap, the attack simulator "enables admins to send simulated attacks (10-15 different attack categories including phish, brute force password cracking, etc.) to their end users to determine how they respond to attacks and determine if the right policies are in place to help mitigate real attacks."
Also close are some additional features for the Office 365 Secure Score, which originally came out a year ago to give organizations a base security score from Microsoft based on dozens of factors in Office 365 covering user behaviors and security settings. It's like a credit score for an organization's cloud collaboration security posture.
Now Microsoft is adding an "Industry Average Score," displaying average scores that a company can compare to its own. Microsoft is also testing an "Active Seat Average Score and Reporting Updates" feature for the Office 365 Secure Score. That will allow customers to compare their score against the average for organizations with a similar number of Office 365 active seats. The update will also help organizations compare their own score between two different dates and offer the option to search a list of actions.
Microsoft is also fine-tuning the Office 365 Message Encryption capabilities it released in September. The feature was designed to make sharing of encrypted and rights-protected messages more seamless. However, the original release applied additional message restrictions, such as Do Not Forward. With the new version, administrators in the Admin Portal, or users in their Outlook client, can choose "encrypt only," without any other message restrictions.
In another change set to arrive shortly, Microsoft will add malicious link protection for end users sending e-mails within the same organization. Office 365 Advanced Threat Protection Safe Links for internal e-mails will include time-of-click protection and other functionality of Safe Links, Microsoft said. Slightly later in the quarter, Microsoft plans to introduce Office 365 Cloud App Security -- App Permission Alerts. The feature will allow administrators to create policies to be alerted when a user grants permission to an application to access Office 365 information.
All of the security features are currently in the "in development" section of Microsoft's Office 365 Roadmap page. Although many are supposed to be released very soon, the rollout for the Office 365 user base is staged and can take weeks or months.
Posted by Scott Bekker on 02/01/2018 at 7:47 AM
A Symantec Norton survey released this month estimated that close to 1 billion people were affected by cybercrime in 2017.
Norton's exact figure is 978 million people, determined from a mammoth survey of 21,549 people in 20 countries (counting China and Hong Kong as separate countries) that was conducted in October. To reach such a massive number, Norton took an expansive view of cybercrime. Respondents were counted as victims if they answered that they had been hit by any of 20 different types of cybercrime.
Some were serious financial problems with quantifiable monetary costs, such as experiencing a ransomware attack, experiencing credit or debit card fraud, making an online purchase that turned out to be a scam, falling for a technical support scam, or losing a job or a promotion due to a social media posting that the victim did not post.
Others were serious problems that could lead to, but didn't necessarily involve, direct financial damages, such as being notified that your personal information was involved in a data breach, having an account password compromised, being a victim of identity theft, having a device infected by a virus or other security threat, having payment information stolen from a phone, clicking on a phishing e-mail or having financial information compromised from shopping online.
Also included were thorny situations that probably wouldn't lead to direct financial damages, but could take a lot of time and effort to fix. That category included unusual activity or unauthorized access to home Wi-Fi networks, social media accounts, e-mail accounts or smart home devices; location-based information being accessed without permission; having a child suffer online bullying; or having a child's online activity compromise the family's security.
By incident type, the biggest problem was malware infections, which were experienced by 36 percent of respondents. Malware was followed by password compromises at 18 percent, credit/debit card fraud at 17 percent, personal information compromised in a data breach at 16 percent, and unauthorized hacks of e-mail or social networking accounts at 16 percent.
The Norton survey's overall victim estimate, while large, passes the smell test. In fact, it could be conservative. After all, Yahoo revealed in October 2017 -- the same month that the Norton survey was conducted -- that 3 billion user accounts were impacted in a previously reported 2013 data breach that the company had originally thought affected 1 billion users. That total had to include a large percentage of system accounts tied to organizational departments or job roles, as well as multiple accounts tied to individual users, but still -- it's a lot of people.
The Norton survey comes at the question from a different angle, by having users report their own incidents. And that 978-million-victim estimate is an eye-opening figure. Norton estimated that the total population of the countries studied was 3.1 billion, meaning nearly a third of all the people in those countries were hit. The report further estimates that the online population in the study's geographies was 1.8 billion, putting victims at 54 percent.
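The report's headline shares follow directly from its population estimates. A quick check of the arithmetic:

```python
# Norton's 2017 victim estimate against its population figures
victims = 978e6            # estimated cybercrime victims
total_population = 3.1e9   # population of the surveyed geographies
online_population = 1.8e9  # estimated online population there

print(f"{victims / total_population:.0%}")   # ~32% of all people
print(f"{victims / online_population:.0%}")  # 54% of online users
```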
In other words, more online denizens were hit by cybercrime in 2017 than were not.
Posted by Scott Bekker on 01/29/2018 at 8:42 AM
Former Citrix CEO and longtime Microsoft senior executive Kirill Tatarinov is joining the board of directors at Acumatica, a Bellevue, Wash.-based cloud ERP company.
On Acumatica's board, Tatarinov will be advising his former Microsoft colleague, Jon Roskill, who joined Acumatica as CEO in 2014 after capping a long Microsoft career with a stint as Microsoft's channel chief.
"Kirill is a very big proponent of advanced technology," Roskill said in a statement, "and his views align well with our intelligent ERP efforts on machine learning, natural user interfaces, and Blockchain. Having another technology advocate on the Board will continue to inspire our product development."
Tatarinov and Citrix parted ways in July after he held the job for about 18 months. Previously, he worked at Microsoft for 13 years, including a lengthy period running Microsoft Business Solutions/Dynamics, which includes the company's ERP and CRM products.
On the Acumatica board, the Moscow-born Tatarinov joins investor and technologist Serguei Beloussov, the executive chairman and co-founder of Acumatica. Beloussov, a native of St. Petersburg, Russia, is also the co-founder, CEO and chairman of the board of Acronis and executive chairman of the board and chief architect of Parallels.
Posted by Scott Bekker on 01/25/2018 at 4:26 PM
The massive Equifax breach dominated the security headlines last year, but Microsoft security experts are contending that Petya and WannaCrypt are representative of a dangerous new category of cyberattacks that emerged in force in 2017.
In a blog post Tuesday, Mark Simos, lead cybersecurity architect for the Microsoft Enterprise Cybersecurity Group, said the two attacks "reset our expectations" for how bad a cyberattack can be in terms of speed and scope of damage. Simos termed Petya and WannaCrypt, also known as WannaCry, as "rapid cyberattacks."
As a definition for this class of attacks, Simos wrote, "Rapid cyberattacks are fast, automated, and disruptive -- setting them apart from the targeted data theft attacks and various commodity attacks, including commodity ransomware, that security programs typically encounter."
To fit the bill, an attack must be rapid, spreading in minutes through an enterprise; automated, with no human interaction required; and disruptive, with intentional destruction or encryption of data and systems.
Both pieces of malware exploited vulnerabilities in Windows. Petya first appeared in early 2016 as a somewhat standard family of encrypting ransomware that encrypted hard drives, then prompted users for a Bitcoin payment.
The novel bits came in June 2017 in a severe cyberattack with worldwide effect, but that hit Ukraine especially hard and prompted suspicion that it was a targeted assault on that country's infrastructure. The Petya variant used in that case, also called NotPetya, spread through compromised tax preparation software common in Ukraine called MEDoc. NotPetya also used the EternalBlue exploit of a Windows Server Message Block vulnerability and other techniques to traverse networks. EternalBlue had been leaked by the Shadow Brokers hacker group in April 2017, and was widely believed to be a U.S. National Security Agency (NSA) hacking tool. Additionally, NotPetya encrypted the file system but solely to destroy a computer; there were no ransom requests.
WannaCry/WannaCrypt, which also spread via EternalBlue without user interaction, also did some severe and widespread damage for a few days in May before a security researcher accidentally discovered a kill switch.
Focusing on Petya, Simos said that particular rapid cyberattack surprised defenders in four ways. It used the supply chain to enter target environments via the MEDoc application instead of phishing or browsing. Petya employed multiple propagation techniques. The malware moved across networks very quickly, outpacing defenders' ability to detect and respond to the attack. Finally, the lack of an apparent ransom motive made the malware destructive.
Simos and Jim Moeller, principal consultant for Cyber Security at Microsoft, address the issues in an on-demand webinar called "Protect Against Rapid Cyberattacks (Petya [aka NotPetya], WannaCrypt, and similar)."
Posted by Scott Bekker on 01/24/2018 at 8:01 AM
So is Metalogix for sale, or isn't it?
Metalogix's SharePoint tools competitor AvePoint on Tuesday launched a Metalogix switch campaign with a blog post from Chief Revenue Officer Chris Larsen asserting that Metalogix was for sale.
"If you haven't already heard, Metalogix has put their company up for sale. If you are using any of their products, this potential change in ownership could have a significant impact on the continuity of your IT processes and policies for SharePoint and Office 365," Larsen wrote in the post.
Metalogix CEO Trevor Hellebuyck responded a day later with a blog post titled "Metalogix is Forever" that was not quite a denial of being for sale, but that also pushed back against AvePoint's assertions.
"We don't know what sparked their post, but we will recognize it for what it is: a thinly veiled attempt to capture customers who they couldn't otherwise attract with AvePoint solutions. We'll simply say that we are a successful private equity backed business that attracts a lot of attention. Sometimes we respond to that attention; many times we don't," Hellebuyck wrote.
Posted by Scott Bekker on 01/18/2018 at 12:13 PM
It's certainly been a rough start to 2018 for Microsoft's virtual assistant.
- Even inside Microsoft, Cortana's been getting some rejections. On Jan. 5, Microsoft discontinued a public preview of an integration between Cortana and Dynamics 365 that the company had previously promoted. The preview had put Dynamics 365 in Cortana's notebook, and Cortana had prompted users with relevant information about sales activities, accounts, opportunities and meetings.
- Cortana was supposed to be besties with Alexa right now. Microsoft and Amazon had announced back in August that people would be able to use Cortana on Windows 10 PCs to access Alexa and to use Alexa on the Amazon Echo and other Alexa-enabled devices to access Cortana. The two would become like a team of assistants, allowing Alexa to handle managing Cortana specialties like booking meetings or accessing work calendars when a user was near an Echo, and allowing Cortana to control Alexa specialties like shopping on Amazon.com or controlling smart home devices from a Windows 10 PC. The integration was supposed to be done by the end of the year. But the companies missed the deadline and have not provided a new target date.
- Alexa is elbowing its way onto Windows territory. During CES last week, Acer announced that it would be bringing Alexa to some of its Aspire, Spin, Switch and Swift notebooks starting in the United States in the first quarter of 2018, with broader availability coming in the middle of the year. Other OEMs have discussed Alexa integrations, as well.
- CES buzz in general was heavy on Alexa, with some Google Assistant thrown in. It was the second big Alexa year in a row for CES. Cortana, on the other hand, did not make any kind of splash at the show. Apple Siri was also a non-factor. Microsoft did try to generate some Cortana CES buzz by highlighting some reference designs from Allwinner, Synaptics, TONLY and Qualcomm.
- Outsiders haven't been bothering to teach Cortana many new skills. As All About Microsoft's Mary Jo Foley pointed out in mid-December, Cortana is seriously lagging behind Alexa in the skills department. Microsoft released the Cortana Skills Kit in May 2017, and take-up has been slow. Alexa had 25,784 skills to start 2018, according to Voicebot.ai. Cortana had just 230 as of mid-December. The enthusiasm level is reminiscent of Microsoft's efforts to get modern apps for Windows 8 and apps for Windows Phone -- a slow, late start.
That Cortana is far behind while there's a lot of excitement about voice assistants is not surprising.
For one thing, she's on the wrong platform. Cortana launched as a public face of Windows Phone, and a good one too. With a backstory and fan base from the "Halo" video game franchise, the name was an inspired choice with a built-in personality to draw upon. But Windows Phone went nowhere, so that's not a user base. (Maybe if the Surface Phone materializes, it will be worth revisiting.)
Smartphones are a logical place for voice input -- typing and texting on phones is challenging and annoying, making the annoyances of dealing with a voice interface a reasonable tradeoff. And talking and listening to a phone is theoretically safer than attempting to look at one while driving. There are more than a billion Android smartphones out there, making Google Assistant an automatic player in the voice assistant game. (The inability of Siri to break out as a voice platform is probably more of a strategic concern for Apple than Cortana's position is for Microsoft.)
When it comes to voice-enabled speakers like the Amazon Echo, voice isn't just a competitive interface choice -- it's the only option in most cases. While Amazon is starting from a small base of maybe 20 to 30 million Echo devices sold to date, the company has all the momentum and a lot of industry partner enthusiasm.
Cortana's user base for now is PCs, and when it comes to voice input, it's not a great place to be. The keyboard and mouse/trackpad are an awesome combination -- voice has to get very, very good before it can ever displace those very mature inputs for a user seated in front of a laptop or PC. It's for the same reason that Alexa integration with PCs may be less promising than the PC OEMs make it out to be.
Microsoft's virtual assistant ambitions are bigger than the PC base; in fact, they're bigger than Cortana.
The PC user base is only part of Microsoft's market, and it's a shrinking part. As the company redefines itself as a cloud company, one of its real strengths is its deep history with the enterprise development community and its experience at enabling that community.
Microsoft's official statement about discontinuing the Cortana-Dynamics 365 public preview provides a clear example of the strategy in action:
We are working to deliver a robust and scalable digital assistant experience across all of our Dynamics 365 offerings. This includes natural language integration for customers and partners across multiple channels including Cortana. To that end, we are discontinuing the current Cortana integration preview feature that was made available for Dynamics 365 and we are focusing on building a new long term intelligent solution experience, which will include Cortana digital assistant integration.
Getting developers to use Azure services for voice recognition, chatbots, translation, machine learning and artificial intelligence is a strategic play for Microsoft. Expect the company to keep working to develop first-rate user experiences that evolve the gimmicky aspects of Cortana's personality into a better and better virtual assistant interface for unlocking deeper business value from more and more of Microsoft's advanced cloud services.
Bad start to 2018 or not, Microsoft needs to keep a hand in virtual assistant technologies. As long as that's the case, Cortana will probably continue her role as the public face of that broader and deeper effort.
Posted by Scott Bekker on 01/16/2018 at 3:06 PM
Microsoft SharePoint users surged into cloud deployments in 2017, according to a new survey.
"The SharePoint and Office 365 Industry Survey" released this week by SharePoint tools suppliers Sharegate, Hyperfish and Nintex included responses from about 450 IT professionals and SharePoint administrators. What makes the survey interesting is that the same three companies surveyed a random sample of their combined client pools in 2016, as well, providing lots of data points for comparison.
There was a triple-digit increase -- 167 percent -- in SharePoint Online deployments from 2016 to 2017. While only 21 percent of respondents in 2016 had SharePoint Online deployed, that number soared to 56 percent in 2017. And although more than half of companies now had SharePoint Online deployed, many of them were still running on-premises SharePoint deployments in parallel.
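The 167 percent figure follows from the two share numbers. A quick sketch of the arithmetic, treating the percentages as shares of the same respondent base:

```python
# Growth in SharePoint Online deployment share, 2016 -> 2017
share_2016 = 0.21  # 21 percent of respondents in 2016
share_2017 = 0.56  # 56 percent in 2017

growth = (share_2017 - share_2016) / share_2016
print(f"{growth:.0%}")  # 167%, matching the survey's reported increase
```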
Yet another data point in the survey shows more and more users trusting their entire SharePoint workload to the cloud. In 2016, one-fifth of users had SharePoint deployed exclusively online. A year later, that number was nearly a third (31 percent). At the same time, hybrid environments (a mix of SharePoint Online and on-premises SharePoint deployments) dropped by 7 percentage points to 34 percent and on-premises-only environments dropped by 2 percentage points to 35 percent in 2017.
The shift to the cloud in SharePoint is mirrored on the Active Directory (AD) side in the vendor survey. In 2016, a very slight majority of AD deployments involved on-premises AD (51 percent). But in 2017, that number fell to 42 percent, while a mix of on-premises and Azure AD jumped 3 percentage points to 34 percent and pure Azure AD deployments rose 4 percentage points to 16 percent.
The survey also reveals the relative share of the last six on-premises versions of SharePoint, dating all the way back to SharePoint 2001, although that version and SharePoint 2003 are present in low enough numbers to make any conclusions about the trends on those editions statistically questionable.
Among the newer versions, the only one gaining significant share is the most recent, SharePoint 2016, which saw a 67 percent increase in deployments from 2016 to 2017. While impressive, it's gaining share at a much lower rate than SharePoint Online/Office 365 and from a smaller base. SharePoint 2016 ended 2017 with a presence in 25 percent of respondents' shops.
Holding steady and maintaining the largest share of any edition, including SharePoint Online, is SharePoint 2013. Deployed at 66 percent of respondents' sites, SharePoint 2013 won't maintain its lead through 2018 if SharePoint Online continues its momentum.
For 2017, SharePoint Online seemed to be taking most of its share from SharePoint 2007, which dropped 2 percentage points to 18 percent, and especially from SharePoint 2010, which dropped 8 percentage points to 40 percent.
As Office 365 deployments continue to gallop ahead, there is little reason to suspect that SharePoint Online's share of overall SharePoint workloads won't continue to increase. The question is how fast.
As befits a survey fielded by tools vendors, a statement accompanying the data points out that obstacles remain for those still moving to SharePoint Online.
"The move to the cloud is not always as easy as it sounds. Microsoft has released a content migration tool to help customers leave SharePoint 2010 and 2013, but it just isn't enough. Here at Sharegate, we still see a large number of customers leveraging our tools to migrate while keeping their existing site structure and objects," said Benjamin Niaulin, Microsoft Regional Director & Product Advisor at Sharegate.
Among the challenges are ongoing concerns about security, cost constraints, time constraints and difficulties in migrating SharePoint customizations from on-premises to online.
This survey says progress to the cloud in 2017 was rapid. The question for 2018 will be whether that pace can continue. Were we looking at low-hanging fruit, easy wins and pilot projects that could stall slightly this year? Or was it an early majority shift that could bring nearly half of the SharePoint customer base exclusively into the cloud by year's end?
Posted by Scott Bekker on 01/10/2018 at 2:29 PM
Intel will release updates for the Meltdown and Spectre vulnerabilities by the end of January for all chips released in the last five years, CEO Brian Krzanich said Monday.
"For our processors and products introduced in the past five years, Intel expects to issue updates for more than 90 percent of them within a week, and the remaining by the end of January," Krzanich said.
His comments came at the start of his keynote Monday night to kick off the CES industry conference in Las Vegas. Facing by far the biggest security crisis since he took over as CEO in May 2013, Krzanich used the first two minutes of the keynote to discuss the security issues before pivoting to a more standard, future-oriented keynote focused on Intel's technologies for artificial intelligence and virtual reality.
Reports emerged last week that Intel and its hardware, operating system and other industry partners were working on patches for a major vulnerability in processors that could allow an attacker to collect sensitive data from computing devices that were working as designed. Intel confirmed and elaborated on the vulnerabilities in a series of public statements last week.
As he thanked industry partners for their speed and effort to release patches, Krzanich showed a slide with statements from those companies about how systems had already been patched. One statement on the slide from Amazon noted, "This is a vulnerability that has existed for more than 20 years in modern processor architectures like Intel, AMD, and ARM across servers, desktops, and mobile devices."
Krzanich's comments Monday did not address whether Intel planned to release updates for products that are more than five years old.
His CES comments were also less emphatic than Intel's public statements from last week in downplaying the possibility of performance hits from the patches, although that lack of emphasis could have been simply an effort to get on with the main keynote.
"We believe the performance impact of these updates is highly workload-dependent. Now as a result, we expect some workloads may have a larger impact than others so we'll continue working with the industry to minimize the impact on those workloads over time," Krzanich said Monday.
Previous Intel statements had added that the performance impact for the average computer user "should not be significant," and the company also released partner statements from Apple, Microsoft, Amazon and Google describing the impact with words like "not...meaningful," "not...noticeable," "no measurable reduction" and "negligible impact."
Krzanich encouraged users to apply updates as soon as they become available, and said the exploits don't appear to have been used maliciously yet. "As of now we have not received any information that these exploits have been used to obtain customer data, and we're working tirelessly on these issues to ensure it stays that way," he said.
Posted by Scott Bekker on 01/09/2018 at 8:40 AM
Reports have been bubbling up this week that vendors and open source teams are hustling under embargo to fix a major security flaw affecting Intel processors released over the last decade. The rumored software fix could seriously slow down both personal systems and public clouds.
Here's the top of The Register's report from Tuesday night:
A fundamental design flaw in Intel's processor chips has forced a significant redesign of the Linux and Windows kernels to defang the chip-level security bug.
Programmers are scrambling to overhaul the open-source Linux kernel's virtual memory system. Meanwhile, Microsoft is expected to publicly introduce the necessary changes to its Windows operating system in an upcoming Patch Tuesday: these changes were seeded to beta testers running fast-ring Windows Insider builds in November and December.
Crucially, these updates to both Linux and Windows will incur a performance hit on Intel products. The effects are still being benchmarked, however we're looking at a ballpark figure of five to 30 per cent slow down, depending on the task and the processor model. More recent Intel chips have features -- such as PCID -- to reduce the performance hit. Your mileage may vary.
The next Patch Tuesday is Jan. 9. Microsoft also sent out warnings to some users that their Azure Virtual Machines would undergo an unusual reboot for security and maintenance on Jan. 10, and Amazon Web Services (AWS) e-mailed users of a maintenance reboot on Jan. 5-6, The Register noted. Officially, all the vendors are declining comment.
Patch Tuesdays are always mark-the-date events for IT, but this flaw is looking more like an all-hands-on-deck situation -- both for the security issues and for the potential of subsequent, permanent performance problems.
UPDATE: Intel released its first statement on the issue Wednesday afternoon, confirming a serious security problem and a fix timeframe for next week, but pushing back partially on the performance hit and on reports that the problem only affected Intel chips. Here's the statement:
Intel Responds to Security Research Findings
Intel and other technology companies have been made aware of new security research describing software analysis methods that, when used for malicious purposes, have the potential to improperly gather sensitive data from computing devices that are operating as designed. Intel believes these exploits do not have the potential to corrupt, modify or delete data.
Recent reports that these exploits are caused by a "bug" or a "flaw" and are unique to Intel products are incorrect. Based on the analysis to date, many types of computing devices -- with many different vendors' processors and operating systems -- are susceptible to these exploits.
Intel is committed to product and customer security and is working closely with many other technology companies, including AMD, ARM Holdings and several operating system vendors, to develop an industry-wide approach to resolve this issue promptly and constructively. Intel has begun providing software and firmware updates to mitigate these exploits. Contrary to some reports, any performance impacts are workload-dependent, and, for the average computer user, should not be significant and will be mitigated over time.
Intel is committed to the industry best practice of responsible disclosure of potential security issues, which is why Intel and other vendors had planned to disclose this issue next week when more software and firmware updates will be available. However, Intel is making this statement today because of the current inaccurate media reports.
Check with your operating system vendor or system manufacturer and apply any available updates as soon as they are available. Following good security practices that protect against malware in general will also help protect against possible exploitation until updates can be applied.
Intel believes its products are the most secure in the world and that, with the support of its partners, the current solutions to this issue provide the best possible security for its customers.
Posted by Scott Bekker on 01/03/2018 at 4:02 PM
Yes, we know most passwords are lame, and we've known it for years. A look at the worst passwords of 2017 confirms the depressing reality:
This particular gallery of sad passwords comes from SplashData's seventh annual list of the 100 worst passwords. The 2017 list was based on 5 million passwords leaked in 2017, not including those from the Yahoo e-mail breach or from adult sites.
A few items on the list changed very little from previous lists by SplashData, a provider of password management software and services. The numeric entries are pretty similar; "123456" held the top spot last year. Only "123456789" is a new number in the Top 10, possibly due to more password-creation filters requiring more than eight characters. The password "password" retained the No. 2 spot.
Some of the new passwords conform to ideas that were in the air in 2017 -- "monkey," "starwars," "freedom" and "trustno1," for example. It's a useful reminder that while we may think of a password as clever relative to our Dunbar's number of about 150 friends and acquaintances, it's probably not unique among the hundreds of millions of English-speaking Internet users. And standalone dictionary words -- as in, words not part of a passphrase -- are a password no-no, anyway.
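The basic screening logic is simple enough that sites could do it at signup. Here's a toy Python sketch -- the tiny worst-list below is illustrative, not SplashData's actual list, and the length and dictionary-word rules are assumptions, not any standard:

```python
# A tiny, illustrative sample of breach-derived worst passwords.
WORST = {"123456", "password", "123456789", "letmein", "starwars", "monkey"}

def is_weak(password, worst=WORST, min_len=12):
    """Flag passwords that appear on a worst list, are too short,
    or look like a single standalone lowercase dictionary word."""
    if password.lower() in worst:
        return True
    if len(password) < min_len:
        return True
    if password.isalpha() and password.islower():  # one bare word, not a phrase
        return True
    return False

print(is_weak("starwars"))                      # True: on the worst list
print(is_weak("correct horse battery staple"))  # False: long passphrase
```

A real implementation would check against millions of leaked passwords, not six, but even this sketch rejects everything in SplashData's Top 10.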
Other entries in the top 100 reflect the utter frustration users have with being required to enter yet another password on yet another site. The entry "password" itself is partly an illustration of that, along with "whatever" and "blahblah." A new entry, "letmein," could be another. A quartet of profane passwords -- "f***you," "a**hole," "biteme," "pu**y" (asterisks mine) -- express that frustration in a pure, crass form.
As a matter of fact, nearly every password in the top 100 could arguably fit into the category of users saying, "Enough!" "I have to create a username and password to order this pizza? Fine: password." "I have to create a username/password to download this resource that might or might not have any value? Fine: 123456."
One study released in 2016 found that the average user had 27 discrete online log-ins. Others have put the number of accounts people have associated with individual e-mail addresses as high as 130.
While the SplashData list and others like it pull from the lowest-common-denominator passwords -- the ones where users did the absolute least they could do -- there are other reasons we're bad at passwords. For example, sites that don't tell you their password rules until they reject your first attempt. Sites that won't allow a passphrase of words separated by spaces. Sites that won't let you paste in the super-secure passwords generated by a password manager. It also doesn't help that the guy who came up with the rules for creating passwords now admits from retirement that he believes his suggestions were somewhat misguided.
What the list really points to is the fact that passwords are broken. Microsoft highlighted the issue with a long article on Dec. 26 about all the ways it's working to fix log-on processes by eliminating passwords. Components of the effort include Windows Hello (the identity technology built into Windows 10 for use with biometric sensors), the Microsoft Authenticator App, and the company's participation in the FIDO (Fast IDentity Online) Alliance developing open standards for authentication.
We'll watch those efforts with great interest throughout 2018. But we have little hope that bad-password lists will be less newsworthy by January 2019 or even January 2020.
Posted by Scott Bekker on 01/02/2018 at 11:31 AM
Ray Ozzie was a quiet presence in Redmond, but he left deep footprints throughout Microsoft's global operation that will last for years.
Ozzie is stepping down as chief software architect and preparing to retire, according to an employee e-mail that Microsoft CEO Steve Ballmer sent out Monday.
"He will remain with the company as he transitions the teams and ongoing strategic projects within his organization -- bringing the great innovations and great innovators he's assembled into the groups driving our business," Ballmer wrote. "Following the natural transition time with his teams but before he retires from Microsoft, Ray will be focusing his efforts in the broader area of entertainment where Microsoft has many ongoing investments."
Ozzie joined Microsoft in 2005 when the company bought Groove Networks and took on Bill Gates' secondary title of chief software architect a year later as part of Gates' retirement transition plan. Ozzie was said to have been most comfortable working in small groups and sharing ideas on a whiteboard, and he reportedly suffered from stage fright before large crowds. He made a huge splash early with the Internet Services Disruption memo that he wrote and that Gates forwarded to the rest of the company. But since then, except for a few high-profile magazine profiles and a handful of speeches and interviews, he has largely faded from the public eye.
With Ozzie's history in collaboration, especially with Lotus Notes and Groove Networks, some look for his effect in Microsoft's collaboration technologies, and find his influence wanting. But that misses Ozzie's main task, which was to help Microsoft bridge the gap from 1990s dominance to 21st century relevance. Ozzie found that bridge in the cloud and worked relatively quietly, but steadily, on the project.
Ozzie's largest and most tangible footprints are the massive datacenters that Microsoft has been building since 2007. Media reports described Ozzie poring over analyses of available electricity and other factors in choosing sites for the facilities. Meanwhile, Microsoft has received notice for innovations in datacenter design around cooling and power consumption. Microsoft now has mega-datacenters costing about $500 million each in Quincy, Wash.; San Antonio, Texas; Dublin, Ireland; and Chicago. Officials in Virginia recently announced another facility coming in the southern Virginia lakeside town of Boydton.
Smaller Ozzie footprints will be evident in the Windows Azure Platform Appliance containers, a strategic advance which could help Microsoft spread its cloud to organizations around the world where laws, regulation or policy require data to reside inside national borders or organizational walls.
Ozzie's other footprints are evident in the array of cloud services Microsoft now offers across nearly its entire product base. Ballmer acknowledged as much in his company-wide e-mail.
"As a company, we've accomplished much in the past five years as we look at the cloud and services. Windows Live now serves as a natural web-based services complement to both Windows and Office. SharePoint and Exchange have now decidedly embraced the cloud. And by conceiving, incubating and shepherding Windows Azure, Ray helped ensure we have a tremendously rich platform foundation that will enable app-level innovation across the company and by customers for years to come."
"With our progress in services and the cloud now full speed ahead in all aspects of our business, Ray and I are announcing today Ray's intention to step down from his role as chief software architect," Ballmer wrote.
Ozzie's tenure was nothing less than a visionary transformation of Microsoft, fully in line with the title of chief software architect. Ballmer says the position isn't being refilled.
The time for setting the cloud vision and planning the datacenter infrastructure to support it has passed. Ozzie finished that job, and, frankly, given his personality, he's not the right person for the next part -- evangelizing and selling Microsoft's cloud. Ballmer has been taking up that baton aggressively. See his speeches in March ("we're all in"), at the Microsoft Worldwide Partner Conference ("Oh, cloud") and this month ("cloud, cloud, cloud, cloud, cloud").
To paraphrase the quote famously attributed to Ben Franklin about a republic, Ray Ozzie has given Microsoft a potential leadership position in cloud computing, if Microsoft can keep it.
Posted by Scott Bekker on 10/19/2010 at 1:23 PM
Windows 1.0 got off to its auspicious start on Thursday, Nov. 10, 1983, at the Plaza Hotel in New York City. Invitations to the launch were sent to the press in a box with a squeegee. The header read: "For a clear view of what's new in microcomputer software please join Microsoft and 18 microcomputer manufacturers for a press conference…"
But, like many versions of Windows that would follow it, the first release didn't ship until two years after that fateful press conference, leading many to refer to it as "vapor-ware." Finally, Microsoft released Windows 1.0 in November of 1985 at Comdex.
When it comes to Windows 1.0, Microsoft prefers not to look that far back and has no apparent plans to celebrate its pending 25th anniversary. When we reached out to Microsoft to talk about its first rendition of Windows, the company declined to make anyone who was there at the time available. "We are focusing our anniversary efforts on the Windows 7 first birthday so unfortunately we won't be able to provide a briefing from someone [from] the Windows 1.0 days," a spokeswoman for Microsoft said in declining our request.
Of course, that anniversary is coming next month on Oct. 22. But we want to hear your recollections of Windows 1.0. Drop me a line at email@example.com.
Posted by Scott Bekker on 09/10/2010 at 1:23 PM
The markets and the Fed aren't the only ones saying the recovery is slowing enough to cause concern. Warning signs are flashing all through the small business and IT markets.
The National Federation of Independent Business (NFIB) released results of its latest Index of Small Business Optimism on Tuesday. See the full PDF here.
The survey was conducted in July and the results don't show much optimism. The index lost 0.9 points in the July run compared to June for a reading of 88.1. According to the report's summary, "The persistence of Index readings below 90 is unprecedented in survey history." NFIB has been running the survey quarterly since 1973 and monthly since 1986.
"The performance of the economy is mediocre at best, given the extent of the decline over the past two years. Pent up demand should be immense but it is not triggering a rapid pickup in economic activity. Ninety percent of the decline this month resulted from deterioration in the outlook for business conditions in the next six months. Owners have no confidence that economic policies will 'fix' the economy," report authors William Dunkelberg and Holly Wade wrote.
Other findings from the survey are that hiring plans are historically weak, capital spending plans are near the record low set in December 2009 and profit trends are worsening.
Researchers at Ovum reported Tuesday that the number of contracts in the IT service sector increased in the second quarter -- but hold the applause. Despite a 14 percent sequential increase in deal count, from 401 in Q1 to 457 in Q2, the total contract value (TCV) of those deals fell 14 percent to $30.8 billion.
In a statement, Ovum analyst Ed Thomas indicated that IT service providers dealing with the public sector were faring slightly better than their private-sector counterparts.
"Public sector demand remained steady, particularly in the U.S., which accounted for more than 90 percent of the market's quarterly TVC. This was good news for vendors with a major focus on the U.S. government sector, notably General Dynamics, Lockheed Martin and SAIC," Thomas said. "Concerns remain about the scale of outsourcing in the private sector, where TCV for Q2 slipped to only $10 billion as clients shied away from signing large deals."
In a report released earlier this month on worldwide IT spending, IDC reported that first half spending exceeded the analyst firm's expectations and raised spending forecasts for the full year to $1.51 trillion, a 6 percent increase over 2009. By segment, the forecasts are for hardware growth of 11 percent, software growth of 4 percent and services growth of 2 percent. However, the firm tempered its enthusiasm with concerns about the global economy.
"We stand in the middle of two powerful and opposing forces," wrote IDC analyst Stephen Minton. "On the one hand, the very real pent-up demand for new IT investment, which has driven the solid recovery in the first half of 2010 and which will hopefully continue into 2011. On the other hand, the potential loss of confidence in a global economy which remains extremely vulnerable to any further escalation of the European debt crisis or a deterioration in the U.S. stock market."
What are you seeing? Drop me a line at firstname.lastname@example.org.
Posted by Scott Bekker on 08/11/2010 at 1:23 PM
Another Microsoft Worldwide Partner Conference is in the bag. Here are 11 key takeaways from the 2010 WPC:
1. Microsoft wants partners to be "all-in" on the cloud. Nearly everything was about cloud computing. That was a little weird for partners coming in from countries where BPOS and other offerings haven't rolled out yet, but pretty compelling for U.S. partners.
2. Keep an eye on the Windows Azure Appliance. The 900-server, private cloud enclosures are supposed to be coming this year from HP, Dell and Fujitsu -- extending Microsoft's cloud story.
3. Dynamics CRM Online. Margins jump to 40 percent in year one, and 6 percent recurring -- a huge bump from the old 18/6 mix. The offer is only guaranteed to be in place for a year. At the same time, partners are getting 250 Dynamics CRM Online seats for internal use.
4. Cloud Pack Essentials. A quick and dirty set of tools for partners to start moving their business onto the cloud.
5. Cloud Accelerate. A new badge to help born-on-the-cloud partners stand out.
6. Steve Ballmer seemed down. Kevin Turner was at the top of his aggressive game. Outgoing WPG CVP Allison Watson seemed wistful. New Worldwide Partner Group Corporate VP Jon Roskill was approachable.
7. Full speed ahead on the Microsoft Partner Network. New channel chief Roskill has no plans to pause the implementation. New benefits and requirements go online in October, barring technical complications.
8. Gold is back, sort of. The new Gold Certified Partner level will be out when MPN goes into full effect, but the Competencies and Advanced Competencies have been renamed Silver Competencies and Gold Competencies.
9. Microsoft is eyeing MSPs. With Windows Intune and future scaled-down Azure appliances, Microsoft is paying attention to the managed service provider market.
10. The heavy layoffs just ahead of WPC caused scheduling turmoil for partners and vendors, many of whose contacts were suddenly gone.
11. Nonetheless, partner enthusiasm was pretty high, with many partners telling us Microsoft seemed to have its mojo back. Partner attendance was huge at a reported 9,300 out of about 14,000 total attendees.
Posted by Scott Bekker on 07/19/2010 at 1:23 PM
The IAMCP, which now stands for the International Association of Microsoft Channel Partners, is coming off its first national meeting, held last month in regional offices and remotely throughout the country. The gathering featured a keynote from Cindy Bates, Microsoft vice president of U.S. Partner Strategy. Bates is one of the top two Microsoft partner executives nationally, so her keynote was a good vote of confidence for the IAMCP's first national event. My colleague Jeff Schwartz attended the New York presentation (see his report).
Similarly, the New York IAMCP chapter landed a keynote from Microsoft Chief Operating Officer Kevin Turner last October. That's an impressive amount of love from one of Microsoft CEO Steve Ballmer's direct reports.
Meanwhile, the Microsoft Worldwide Partner Group has been heavily engaged with the Washington, D.C. chapter of the IAMCP in planning for the Worldwide Partner Conference there in July.
Now the IAMCP is announcing a new engagement model with Microsoft's U.S. Partner Group. In the U.S. IAMCP May newsletter that went out June 3, the organization announced, "The Microsoft U.S. Partner Team will be launching a new IAMCP engagement model framework outlining prescriptive guidance on how Microsoft will support IAMCP chapters across the US."
The engagement model will come in two tiers. Ten of the 35 chapters of the U.S. IAMCP will get what is called Core Coverage, under which they will be assigned a Microsoft Engagement Team. The team consists of an Area Partner Territory Manager, a Local Engagement Team Business Development Manager and one Field SMB Marketing Manager.
The other 25 U.S. chapters will get Extended Coverage, which will involve a smaller Microsoft Engagement Team -- an Area Partner Territory Manager and an SMB Marketing Manager -- working with the three IAMCP regional leads. The regional leads are Howard Cohen, Eastern Region; Richard Losciale, Central Region; and Marc Hoppers, Western Region. According to the IAMCP newsletter statement, the extended coverage will have "an emphasis on communications support over in-person meetings and presentations."
Cohen, who is also the Communications Chair for the U.S. IAMCP Board, said in an interview that IAMCP will choose which 10 chapters qualify for core coverage. "It's a combination of proximity to a Microsoft office and the size and resourcefulness of the chapters," Cohen said. Those decisions will be made sometime before the Microsoft-IAMCP engagement model launches next quarter.
The new model arose from a mutually recognized reduction in field engagement between IAMCP and Microsoft that started about 18 months ago, when the recession was at its worst.
"Up until about a year and a half ago, field engagement was terrific. In addition to the PAMs managing managed partners, there was a Partner Community Manager working with the IAMCP chapter, as well as Area Sales Managers," Cohen said.
"Over the last year and a half, all of those people who were partner-facing were really turned customer-facing," he added. "It became more and more difficult to do fundamental things, to work with Microsoft tactically to get things done. Even for the IAMCP chapter, which was usually the alternative that people would turn to when they couldn't get traction with Microsoft, it was even difficult for us to get traction.".
The problems weren't universal to all geographies, and the IAMCP began discussions with Microsoft several months ago in a project called Consistent Touch, Cohen said.
"We're very happy about this. This is a real recognition that the relationship that we worked for over the years has really worked and is really delivering results for our members," Cohen said.
There's been a lot of concern among partners that the new Microsoft Partner Network (MPN) favors large partners with dozens of engineers at the expense of the smaller shops that make up the bulk of Microsoft's massive channel. Of special concern is the MPN requirement, effective in October, that employees certified to qualify a company for an Advanced Competency cannot be used to qualify the company for any other Advanced Competencies.
The increased engagement with IAMCP, and the attention to the partners of all sizes that the organization represents, is a solid step on Microsoft's part to do right by its partner community. It also means that if you're feeling frustrated by your interactions, or lack thereof, with Microsoft, it may be a good time to join the IAMCP.
As for the name, the IAMCP has long been known as the International Association of Microsoft Certified Partners. The word Certified is now officially outdated as the Certified and Gold Certified levels of the Microsoft Partner Program officially switch off when the MPN goes fully live in October. While the organization's legal name, "IAMCP," is unaffected, the group has changed its logo and Web site references from "Certified" to "Channel."
Posted by Scott Bekker on 06/03/2010 at 1:23 PM
A shout out to our sister publication for government IT consultants, Washington Technology, which ran a piece this month about the famous Los Angeles-Google deal. Writer David Hubler goes into a lot of depth about Computer Sciences Corp.'s role, partnering with Google to implement the messaging system. The system is eventually supposed to cover 30,000 public employees. If the implementation is a success, it will be another major case study supporting cloud mail over on-premise systems like Microsoft Exchange, IBM Lotus or Novell GroupWise -- the last being the system the Google setup will replace. Of course, if it doesn't work properly...
Posted by Scott Bekker on 05/24/2010 at 1:23 PM
Big Blue made a major customer acquisition move today in buying Sterling Commerce, according to an analyst. IBM is buying the Dublin, Ohio-based electronic data interchange (EDI) software company from AT&T for $1.4 billion. Analyst Ray Wang told RCP's Jeffrey Schwartz that because Sterling processes large volumes of transactions between B2B trading partners, it brings IBM a lot of high-value customers among large banks, telcos and retailers.
Posted by Scott Bekker on 05/24/2010 at 1:23 PM
The slow rollout of the new Microsoft Partner Network passed a milestone today with the launch of the new competency structure and the new Action Packs.
Any partner with a Microsoft competency and specialization under the old system was supposed to be automatically transitioned into a new competency, with an e-mail notification. For some partners, the new competency name won't be much of a change. For example, the Security Solutions competency with a specialization in Identity & Secure Access will now go by the competency name Identity and Security. The ISV competency goes to, wait for it, ISV. For others, though, the new competency name is a lot different. Partners with the competency/specialization combo of Information Worker Solutions/Office Solutions Development are now in the Portals and Collaboration competency.
Even for those with big changes in the name of their competency, the difference is purely between Microsoft and partners for now. All competency benefits stay the same until a wider set of changes in October. Similarly, partners are supposed to continue using their previous competency logos for now, as well.
The really controversial changes to the competency structure occur in October. At the same time as new benefits are launched, Microsoft will introduce the advanced competency structure. Small- to mid-size partners have been especially concerned about those changes, which will eliminate the Gold Certified Partner level and will require partners to have unique engineers dedicated to each competency for the advanced level. For example, a partner looking to get an advanced competency in both Business Intelligence and Data Platform won't be able to share engineers for both competencies. Not a big deal for the Avanades of the world, but a gating factor for five-to-20-person partner shops.
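The gating factor is easy to see if you model the October rule as a constraint check. A toy Python sketch -- the shop, competency names and engineer names are hypothetical, and this is only an illustration of the no-sharing rule as described above, not any Microsoft tool:

```python
def double_counted_engineers(assignments):
    """assignments maps advanced competency -> set of engineer names.
    Under the rule described above, an engineer may count toward only
    one advanced competency; return any engineers counted twice."""
    seen, duplicates = set(), set()
    for engineers in assignments.values():
        duplicates |= seen & engineers  # already claimed by another competency
        seen |= engineers
    return duplicates

# A five-person shop trying to stretch staff across two advanced competencies:
shop = {
    "Business Intelligence": {"ana", "ben"},
    "Data Platform": {"ben", "cara"},  # "ben" is doing double duty
}
print(double_counted_engineers(shop))  # {'ben'}
```

For a 20-person shop, every additional advanced competency means hiring or certifying fresh engineers; for an Avanade-sized partner, the constraint barely registers.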
Also today, new subscription programs go into effect. Microsoft is ending the current Microsoft Action Pack Subscription (MAPS), a massive program with a huge and mostly adoring fan base. As of today, there are two new versions of the Action Pack: The Action Pack Solution Provider subscription and the Action Pack Development and Design subscription. Microsoft is also ending the much smaller but also highly regarded Empower for ISV programs.
Posted by Scott Bekker on 05/24/2010 at 1:23 PM
Former high-ranking Microsoft executive Maria Martinez has landed at Microsoft's archrival in the cloud CRM space, Salesforce.com, less than a year after she retired from Microsoft.
Martinez was announced Wednesday as executive vice president of Customers for Life, Salesforce.com's department dedicated, obviously enough, to customer retention. She'll report to Frank van Veenendaal, president of worldwide sales and services.
Martinez left Microsoft last July as corporate vice president of Microsoft Services, a position of special interest to large Microsoft partners. That role at Microsoft sets the company's services strategies, including how aggressively or gently Microsoft treats partners when going after consulting service business. The Microsoft post's responsibilities include management of Microsoft Consulting Services.
Microsoft filled Martinez' post immediately with Kathleen Hogan, who began her Microsoft career in 2003 in the partner-facing role of vice president of Customer and Partner Experience. Prior to joining Microsoft, Hogan was a partner at McKinsey & Co. in Silicon Valley and worked at Oracle Corp.
Martinez joins a company that is a poster child for cloud computing businesses and has enjoyed surging revenues, even during the recession. Salesforce.com first cracked the $1 billion revenue mark in its 2009 fiscal year, which ended in January 2009, and reported revenue growth of 21 percent for fiscal year 2010 -- reaching $1.3 billion.
Those revenues are probably only slightly less than Microsoft's revenues across the entire Microsoft Dynamics line -- which includes not only cloud CRM but on-premise CRM and several lines of on-premise ERP (Microsoft bundles Dynamics revenues in with Office revenues in its financial statements, making direct comparisons difficult). But Microsoft isn't enjoying anywhere near the growth in business applications that Salesforce.com is reporting. For the nine months of Microsoft's current fiscal year, the company reported that its Dynamics revenues were down 1 percent.
Posted by Scott Bekker on 05/06/2010 at 1:23 PM
Partners often don't think of opportunity when it comes to Microsoft's management technologies that are branded under the System Center umbrella. But Microsoft is making a major marketing push to get partners involved with two System Center products that were released to manufacturing today. The products are System Center Essentials 2010 and Data Protection Manager 2010.
David Mills, a senior product manager at Microsoft, acknowledged to my colleague Lee Pender that Microsoft has more evangelizing to do with partners on the management side. "There are still a lot of partners who are not aware that Essentials is out there," Mills said. "There's a lot of noise in [the management] space." But Mills also said that because of the number of Microsoft partners and all the potential mid-market customers, the opportunity for partners to help those customers manage their networks is relatively huge.
Microsoft is making a sustained effort to get the word out to the channel ahead of the products' general availability. The effort included a Partner Readiness Week for System Center Essentials 2010 in late February. During that week, Microsoft offered five online training courses about SCE 2010 and DPM 2010.
In a webcast last week on DPM and SCE (pronounced "ski"), RCP Executive Editor Jeff Schwartz talked to Dave Sobel, CEO of Evolve Technologies, a Washington, D.C. area Microsoft Certified Professional partner company. (Sobel's main claim to fame is his cover photo on the February issue of RCP magazine, but I may be biased.)
Sobel told Schwartz that he's already talking to customers about the products and sees a lot of opportunities for his firm.
"We can help them with the installation and the configuration and get [customers] all ready because we have the experience of doing it in multiple environments, and we can tailor it to their environments," Sobel said. "Then we leave them with the tools and help them when they need the partner for escalation on the parts they want assistance with or for the new project work as an add-on."
So far, Sobel said customers have been interested in having the management pieces that SCE 2010 and DPM 2010 provide, particularly the simplified management of the environment when they want to enable their people to do a little bit more, especially on the virtualization side.
"As more and more mid-market organizations are virtualizing, this is a great way for them to keep a handle on correct management of all of those moving parts. What we've been finding is that this is a great, simplified platform to let our customers dig in deep and manage their environment," Sobel said.
Customers are in two camps, Sobel said. Some already have management technology that SCE 2010, especially, could replace or consolidate. Others know they have problems, but they're not sure how to solve them.
"In general, most organizations have some kind of management technology. But often that can be a lot of management process where they run around and do inventory, or they've got these four or five little tools that aren't really a unified piece. Or they have some Tivoli and older management tools or they have some of the tools that come from the hardware vendors," Sobel said. "They're really looking for one that's more robust. I think it's a little bit more greenfield than it is displacement. But you do find that there are these homegrown mismatches of pieces that are doing the management already."
Stay tuned to RCP's May issue for a lot more detail on the partner opportunities in the SCE and DPM releases. In the meantime, check out the news story or listen to a replay of the webcast (Registration required).
Posted by Scott Bekker on 04/19/2010 at 1:23 PM
In my column for the April issue of RCP, "Looking out for the Little Server," I shared my concern that the stand-alone server category may suffer from benign neglect as the industry focuses on data-center blade designs that serve the cloud.
The column prompted a server solution specialist and Microsoft licensing expert with a major distributor, who asked not to be identified by name, to respond with some interesting observations:
"In reference to your column on April 1 (Looking out for the Little Server), I could not agree more. There are a number of players in this space that have convinced themselves that everyone in SMB will go to the cloud, for one reason or another. There are several factors that I'm seeing that push against that thought:
- VARs are only going to move their customers to the cloud if they are convinced that it's secure
- VARs are only going to move their customers to the cloud if they are convinced that they can continue to make money doing so
- VARs that derive any substantial portion of their business from hardware sales are going to need to see substantial financial up-tick to move to the cloud
- VARs will need to be convinced that their cloud providers are not going to take their customers direct
- SMB end-users will need to be convinced of the security of the cloud
- SMB end-users will need to be convinced of the stability and reliability of the cloud
- SMB end-users will need to be convinced that their data will be theirs, and only theirs, no matter whose servers it resides on.
"SBS is a great play, but it needs to be extended to meet more needs. There should be a telephony product that fits better than OCS. There should be a version of CRM that fits this space. There should definitely be an ERP solution that the average small business can use. Microsoft has the stack, but none of the parts know each other.
"If you want to see a nice play, and it makes me crazy to say it, you can take a look at Lotus Foundations Start and Foundations Reach. If you put those two together and add a ShoreTel VoIP system (made to integrate), there's a great SMB play there. I'd love to see Microsoft make a better solution than this (ShoreTel makes a system that integrates with MS CRM, too), but I don't see the current regime supporting that."
Posted by Scott Bekker on 04/19/2010 at 1:23 PM
In honor of Veterans Day tomorrow, I'll quote one of my favorite Joe Toye lines from my favorite HBO series, "Band of Brothers": "Where's the best chow? In Berlin."
You could rephrase the quote this week to "Where's the best e-mail server launch? In Berlin." It doesn't have the same punch, somehow, but it's a big deal for the Microsoft channel all the same.
Microsoft launched the newest version of its $1-billion-plus-per-year e-mail server along with Forefront Protection 2010 for Exchange Server at Tech-Ed Europe in Berlin Monday. Our own Kurt Mackie monitored all the webcasts and posted a lengthy story with a lot of the details about the Software plus Services and unified communications underpinnings of the server here.
The public release meant that Exchange 2010 and the new Forefront Protection product are available as trial downloads.
The Exchange Server 2010 launch comes in the middle of a wave of releases across the Microsoft stack, from Windows 7 a few weeks ago to SQL Server 2008 R2 next year.
Microsoft's been busy priming the channel for all these launches. Microsoft Business Division President Stephen Elop said yesterday that more than 45,000 partners are trained on Windows Server 2008 R2 and Exchange 2010. Several vendor partners announced services and solutions around Exchange 2010 in Berlin, including Advanced Micro Devices Inc., Avanade, Dell Inc., EMC Corp., Kaspersky Lab, Symantec Corp. and Unisys Corp.
In support of the product launch, Microsoft released two documents filled with cost-benefit data that could be useful to partners. The studies, done by Forrester Research and based on customer product trials, are "The Total Economic Impact of Microsoft Exchange 2010" and "The Total Economic Impact of Windows Server 2008 R2." Check out Kurt's story for a lot of handy links to resources.
Posted by Scott Bekker on 11/10/2009 at 1:23 PM
I've been scratching my head lately as I've compared the government's statistics for third quarter GDP growth against the corporate earnings of the IT titans. The U.S. GDP is supposed to be up 3.5 percent for Q3, while Microsoft, Tech Data and Ingram Micro all reported double-digit declines in revenues over roughly the same period.
But finally, some positive news out of the tech sector. IDC says worldwide PC microprocessor shipments in Q3 "rose substantially and to all-time record levels for a single quarter." The bounce in shipments is 23 percent quarter over quarter. Revenues for the same period are up 14 percent.
The story is more subtle than a 1:1 relationship with the U.S. economy, which IDC notes is still hamstrung by housing foreclosures and rising job losses. Many of these chips are being manufactured in China for sale in netbooks there, and IDC warns that the Chinese market is opaque -- inventory can hide in lots of places. But let's keep our fingers crossed that this could be the start of something good.
Posted by Scott Bekker on 11/10/2009 at 1:23 PM
The Microsoft Response Point SMB phone system has been in a holding pattern since Microsoft basically put it in maintenance mode in June, but a few companies have been moving forward with Response Point-based products. The latest is Quanta Computer, which released the RP310 Softphone for Microsoft Response Point Phone Systems today. Quanta is looking for resellers here.
Posted by Scott Bekker on 11/10/2009 at 1:23 PM