Apr
09
2019
--

Google’s hybrid cloud platform is coming to AWS and Azure

Google’s Cloud Services Platform for managing hybrid clouds that span on-premise data centers and the Google cloud is coming out of beta today. The company is also changing the product’s name to Anthos, a name that refers to either a lost Greek tragedy, an obscure god in the Marvel universe or rosemary. That by itself would be interesting but minor news. What makes this interesting is that Google also announced today that Anthos will run on third-party clouds as well, including AWS and Azure.

“We will support Anthos on AWS and Azure as well, so people get one way to manage their application and that one way works across their on-premise environments and all other clouds,” explained Urs Hölzle, Google’s senior VP for technical infrastructure, in a press conference ahead of today’s announcement.

So with Anthos, Google will offer a single managed service that will let you manage and deploy workloads across clouds, all without having to worry about the different environments and APIs. That’s a big deal and one that clearly delineates Google’s approach from its competitors’. This is Google, after all, managing your applications for you on AWS and Azure.

“You can use one consistent approach — one open-source based approach — across all environments,” Hölzle said. “I can’t really stress how big a change that is in the industry, because this is really the stack for the next 20 years, meaning that it’s not really about the three different clouds that are all randomly different in small ways. This is the way that makes these three clouds — and actually on-premise environments, too — look the same.”

Anthos/Google Cloud Services Platform is based on the Google Kubernetes Engine, as well as other open-source projects like the Istio service mesh. It’s also hardware-agnostic, meaning that users can run the service on top of their current hardware without having to immediately invest in new servers.
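The technical underpinning matters here: because every environment, whether GKE, on-premise hardware, AWS or Azure, ends up speaking the same Kubernetes API, a single deployment spec can target all of them. As a rough sketch of that idea (this is the general Kubernetes pattern, not Anthos’s actual management API), the official Python client can push one deployment to several clusters, addressed by hypothetical kubeconfig contexts:

```python
# Sketch: one deployment spec pushed to clusters in different clouds.
# Context names are hypothetical; they would come from your kubeconfig.
from kubernetes import client, config

CONTEXTS = ["gke-prod", "aws-prod", "azure-prod"]

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="web"),
    spec=client.V1DeploymentSpec(
        replicas=3,
        selector=client.V1LabelSelector(match_labels={"app": "web"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "web"}),
            spec=client.V1PodSpec(
                containers=[client.V1Container(name="web", image="nginx:1.15")]
            ),
        ),
    ),
)

for ctx in CONTEXTS:
    # Each context points at a cluster in a different environment;
    # the API calls are identical across all of them.
    api = client.AppsV1Api(api_client=config.new_client_from_config(context=ctx))
    api.create_namespaced_deployment(namespace="default", body=deployment)
    print(f"deployed to {ctx}")
```

What Anthos adds on top of this raw pattern is the managed control plane, policy layer and service mesh that keep those clusters consistent.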

Why is Google doing this? “We hear from our customers that multi-cloud and hybrid is really an acute pain point,” Hölzle said. He noted that containers are the enabling technology for this but that few enterprises have developed a unifying strategy to manage these deployments and that it takes expertise in all major clouds to get the most out of them.

Enterprises already have major investments in their infrastructure and have established relationships with their vendors, though, so it’s no surprise that Google is launching Anthos with more than 30 major hardware and software partners that range from Cisco to Dell EMC, HPE and VMware, as well as application vendors like Confluent, Datastax, Elastic, Portworx, Tigera, Splunk, GitLab, MongoDB and others.

Robin.io, a data management service that offers a hyper-converged storage platform based on Kubernetes, also tells me that it worked closely with Google to develop the Anthos Storage API. “Robin Storage offers bare metal performance, powerful data management capabilities and Kubernetes-native management to support running enterprise applications on Google Cloud’s Anthos across on-premises data centers and the cloud,” said Premal Buch, CEO of Robin.io.

Anthos is a subscription-based service, with list prices starting at $10,000 per month for a 100 vCPU block. Enterprise prices are up for negotiation, though, so many customers will likely pay less.
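To make the list price concrete, here is a small sketch of the math, assuming (my assumption, not a published billing rule) that usage is rounded up to whole 100-vCPU blocks:

```python
import math

LIST_PRICE_PER_BLOCK = 10_000  # dollars per month per 100-vCPU block

def anthos_list_price(vcpus: int) -> int:
    """Monthly list price, rounding up to whole 100-vCPU blocks.

    The rounding is an illustrative assumption; negotiated enterprise
    prices will often be lower.
    """
    blocks = math.ceil(vcpus / 100)
    return blocks * LIST_PRICE_PER_BLOCK

print(anthos_list_price(250))  # 3 blocks -> 30000 dollars/month at list
```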

It’s one thing to use a service like this for new applications, but many enterprises already have plenty of line-of-business tools that they would like to bring to the cloud as well. For them, Google is launching the first beta of Anthos Migrate today. This service will auto-migrate VMs from on-premises or other clouds into containers in the Google Kubernetes Engine. The promise here is that this is essentially an automatic process, and once the container is on Google’s platform, you’ll be able to use all of the other features that come with the Anthos platform, too.

Google’s Hölzle noted that the emphasis here was on making this migration as easy as possible. “There’s no manual effort there,” he said.

Mar
28
2019
--

Microsoft gives 500 patents to startups

Microsoft today announced a major expansion of its Azure IP Advantage program, which provides its Azure users with protection against patent trolls. This program now also provides customers who are building IoT solutions that connect to Azure with access to 10,000 patents to defend themselves against intellectual property lawsuits.

What’s maybe most interesting here, though, is that Microsoft is also donating 500 patents to startups in the LOT Network. This organization, which counts companies like Amazon, Facebook, Google, Microsoft, Netflix, SAP, Epic Games, Ford, GM, Lyft and Uber among its close to 400 members, is designed to protect companies against patent trolls by giving them access to a wide library of patents from its member companies and other sources.

“The LOT Network is really committed to helping address the proliferation of intellectual property lawsuits, especially ones that are brought by non-practicing entities, or so-called trolls,” Microsoft CVP and Deputy General Counsel Erich Andersen told me.

This new program goes well beyond basic protection from patent trolls, though. Qualified startups that join the LOT Network can acquire Microsoft patents as part of their free membership, and, as Andersen stressed, the startups will own them outright. The LOT Network will be able to provide its startup members with up to three patents from this collection.

There’s one additional requirement here, though: to qualify for the patents, these startups also have to meet a minimum Azure spend of $1,000 per month. As Andersen told me, though, they don’t have to make any kind of forward commitment; the company will simply look at a startup’s last three monthly Azure bills.

“We want to help the LOT Network grow its network of startups,” Andersen said. “To provide an incentive, we are going to provide these patents to them.” He noted that startups are obviously interested in getting access to patents as a foundation of their companies, but also to raise capital and to defend themselves against trolls.

The patents we’re talking about here cover a wide range of technologies as well as geographies. Andersen noted that we’re talking about U.S. patents as well as European and Chinese patents, for example.

“The idea is that these startups come from a diverse set of industry sectors,” he said. “The hope we have is that when they approach LOT, they’ll find patents among those 500 that are going to be interesting to basically almost any company that might want a foundational set of patents for their business.”

As for the extended Azure IP Advantage program, it’s worth noting that every Azure customer who has spent more than $1,000 per month over the past three months and hasn’t filed a patent infringement lawsuit against another Azure customer in the last two years can automatically pick one of the patents in the program’s portfolio to protect itself against frivolous patent lawsuits from trolls. (That’s a different library of patents from the one Microsoft is donating to the LOT Network as part of the startup program.)
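Spelled out as a predicate, the eligibility rule reads roughly like this; the function and field names are illustrative, not any actual Microsoft API:

```python
from datetime import date, timedelta

def eligible_for_patent_pick(monthly_azure_spend: list[float],
                             last_suit_filed: date | None,
                             today: date) -> bool:
    """Sketch of the stated rule: more than $1,000/month in Azure spend over
    the past three months, and no patent infringement suit filed against
    another Azure customer in the last two years."""
    last_three = monthly_azure_spend[-3:]
    spend_ok = len(last_three) == 3 and all(m > 1_000 for m in last_three)
    suit_ok = (last_suit_filed is None
               or today - last_suit_filed > timedelta(days=2 * 365))
    return spend_ok and suit_ok

# Example: qualifies on spend, but disqualified by a suit filed a year ago.
print(eligible_for_patent_pick([1200, 1500, 1100],
                               date(2018, 3, 1), date(2019, 3, 28)))  # False
```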

As Andersen noted, the team looked at how it could enhance the IP program by focusing on a number of specific areas. Microsoft is obviously investing a lot into IoT, so extending the program to this area makes sense. “What we’re basically saying is that if the customer is using IoT technology — regardless of whether it’s Microsoft technology or not — and it’s connected to Azure, then we’re going to provide this patent pick right to help customers defend themselves against patent suits,” Andersen said.

In addition, for those who do choose to use Microsoft IoT technology across the board, Microsoft will provide indemnification, too.

Patent trolls have lately started acquiring IoT patents, so chances are they are getting ready to make use of them and that we’ll see quite a bit of patent litigation in this space in the future. “The early signs we’re seeing indicate that this is something that customers are going to care about in the future,” said Andersen.

Feb
24
2019
--

Microsoft announces an Azure-powered Kinect camera for enterprise

Today’s Mobile World Congress kickoff event was all about the next HoloLens, but Microsoft still had some surprises up its sleeve. One of the more interesting additions is the Azure Kinect, a new enterprise camera system that leverages the company’s 3D imaging technology to create a depth camera for businesses.

The device is actually a kind of companion hardware piece for HoloLens in the enterprise, giving businesses a way to capture depth data and leverage Azure services to collect and process it.

“Azure Kinect is an intelligent edge device that doesn’t just see and hear but understands the people, the environment, the objects and their actions,” Azure VP Julia White said at the kickoff of today’s event. “The level of accuracy you can achieve is unprecedented.”

What started as a gesture-based gaming peripheral for the Xbox 360 has since grown into an incredibly useful tool across a variety of different fields, so it tracks that the company would seek to develop a product for business. And unlike some of the more far-off HoloLens applications, the Azure Kinect is the sort of product that could be instantly useful, right off the shelf.
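That off-the-shelf quality extends to the developer experience. As a minimal capture sketch, here is the community pyk4a Python wrapper around the Azure Kinect Sensor SDK (the SDK itself is a C library); it assumes a device is attached, and the configuration values are just one plausible choice:

```python
# Minimal depth-capture sketch using pyk4a, a community wrapper around the
# Azure Kinect Sensor SDK. Assumes an Azure Kinect is plugged in.
import pyk4a
from pyk4a import Config, PyK4A

k4a = PyK4A(Config(
    color_resolution=pyk4a.ColorResolution.RES_720P,
    depth_mode=pyk4a.DepthMode.NFOV_UNBINNED,
))
k4a.start()

capture = k4a.get_capture()
if capture.depth is not None:
    # capture.depth is a NumPy array of per-pixel depth in millimeters.
    print("depth frame:", capture.depth.shape)
k4a.stop()
```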

A number of enterprise partners have already begun testing the technology, including Datamesh, Ocuvera and Ava, representing an interesting cross-section of companies. The system goes up for pre-order today, priced at $399. 

Feb
07
2019
--

Microsoft Azure sets its sights on more analytics workloads

Enterprises now amass huge amounts of data, both from their own tools and applications, as well as from the SaaS applications they use. For a long time, that data was basically exhaust. Maybe it was stored for a while to fulfill some legal requirements, but then it was discarded. Now, data is what drives machine learning models, and the more data you have, the better. It’s maybe no surprise, then, that the big cloud vendors started investing in data warehouses and lakes early on. But that’s just a first step. After that, you also need the analytics tools to make all of this data useful.

Today, it’s Microsoft’s turn to shine the spotlight on its data analytics services. The actual news here is pretty straightforward: two services are moving into general availability, the second generation of Azure Data Lake Storage for big data analytics workloads and Azure Data Explorer, a managed service that makes ad-hoc analysis of massive data volumes easier. Microsoft is also previewing a new feature in Azure Data Factory, its graphical no-code service for building data transformations: the ability to map data flows.
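For a sense of what “ad-hoc analysis” looks like in practice, here is a hedged sketch of querying Azure Data Explorer from Python with the azure-kusto-data package; the cluster URL, database and table names are placeholders:

```python
# Sketch: an ad-hoc KQL query against Azure Data Explorer.
# Cluster, database and table names are placeholders.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
    "https://mycluster.westus.kusto.windows.net"
)
client = KustoClient(kcsb)

# Count events per hour over the last day in a hypothetical Events table.
query = "Events | where Timestamp > ago(1d) | summarize count() by bin(Timestamp, 1h)"
response = client.execute("mydatabase", query)
for row in response.primary_results[0]:
    print(row)
```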

Those individual news pieces are interesting if you are a user or are considering Azure for your big data workloads, but what’s maybe more important here is that Microsoft is trying to offer a comprehensive set of tools for managing and storing this data — and then using it for building analytics and AI services.


“AI is a top priority for every company around the globe,” Julia White, Microsoft’s corporate VP for Azure, told me. “And as we are working with our customers on AI, it becomes clear that their analytics often aren’t good enough for building an AI platform.” These companies are generating plenty of data, which then has to be pulled into analytics systems. She stressed that she couldn’t remember a customer conversation in recent months that didn’t focus on AI. “There is urgency to get to the AI dream,” White said, but the growth and variety of data presents a major challenge for many enterprises. “They thought this was a technology that was separate from their core systems. Now it’s expected for both customer-facing and line-of-business applications.”

Data Lake Storage helps with managing this variety of data since it can handle both structured and unstructured data (and is optimized for the Spark and Hadoop analytics engines). The service can ingest any kind of data — yet Microsoft still promises that it will be very fast. “The world of analytics tended to be defined by having to decide upfront and then building rigid structures around it to get the performance you wanted,” explained White. Data Lake Storage, on the other hand, wants to offer the best of both worlds.
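In practice, “both structured and unstructured” often means files of different shapes sitting side by side in the same store, which Spark addresses through the abfss:// scheme. A short sketch, with the account, container and paths as placeholders and authentication setup omitted:

```python
# Sketch: Spark reading mixed data from Azure Data Lake Storage Gen2.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("adls-gen2-demo").getOrCreate()

base = "abfss://data@myaccount.dfs.core.windows.net"  # placeholder account

# Structured CSV and semi-structured JSON logs live in the same store.
sales = spark.read.option("header", "true").csv(f"{base}/raw/sales/")
logs = spark.read.json(f"{base}/raw/logs/")

sales.groupBy("region").count().show()
logs.printSchema()
```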

White also argued that while many enterprises used to run these analytics services on their on-premises servers, often on dedicated appliances, the cloud has now reached the point where the price/performance calculation is in its favor. It took a while to get to this point, though, and to convince enterprises. White noted that for the longest time, enterprises viewed analytics projects as $300 million efforts that took forever, tied up lots of people and were frankly a bit scary. “But also, what we had to offer in the cloud hasn’t been amazing until some of the recent work,” she said. “We’ve been on a journey — as well as the other cloud vendors — and the price performance is now compelling.” And it sure helps that if enterprises want to meet their AI goals, they’ll now have to tackle these workloads, too.

Nov
19
2018
--

Microsoft acquires FSLogix to enhance Office 365 virtual desktop experience

Back in September, Microsoft announced a virtual desktop solution that lets customers run Office 365 and Windows 10 in the cloud. The announcement mentioned several partners that were working on solutions with Microsoft. One of those was FSLogix, a Georgia-based virtual desktop startup. Today, Microsoft announced it has acquired FSLogix. It did not share the purchase price.

“FSLogix is a next-generation app-provisioning platform that reduces the resources, time and labor required to support virtualization,” Brad Anderson, corporate VP for Microsoft Office 365, and Julia White, corporate VP for Microsoft Azure, wrote in a joint blog post today (https://blogs.microsoft.com/blog/2018/11/19/microsoft-acquires-fslogix-to-enhance-the-office-365-virtualization-experience/).

When Microsoft made the virtual desktop announcement in September, it named Citrix, CloudJumper, Lakeside Software, Liquidware, People Tech Group, ThinPrint and FSLogix as partners working on solutions. Apparently, the company decided it wanted to own one of those experiences and acquired FSLogix.

Microsoft believes that by incorporating the FSLogix solution, it will provide a better virtual desktop experience for its customers, with better performance and faster load times, especially for Office 365 ProPlus customers.

Randy Cook, founder and CTO at FSLogix, said the acquisition made sense given how well the two companies have worked together over the years. “From the beginning, in working closely with several teams at Microsoft, we recognized that our missions were completely aligned. Both FSLogix and Microsoft are dedicated to providing the absolute best experience for companies choosing to deploy virtual desktops,” Cook wrote in a blog post announcing the acquisition.

Lots of companies have what are essentially dumb terminals running just the tools each employee needs, rather than a fully functioning standalone PC. Citrix has made a living offering these services. When employees come in to start the day, they sign in with their credentials and they get a virtual desktop with the tools they need to do their jobs. Microsoft’s version of this involves Office 365 and Windows 10 running on Azure.

FSLogix was founded in 2013 and has raised more than $10 million, according to data on Crunchbase. Today’s acquisition, which has already closed according to Microsoft, comes on the heels of last week’s announcement that Microsoft was buying Xoxco, an Austin-based developer shop with experience building conversational bots.

Sep
25
2018
--

Chef launches deeper integration with Microsoft Azure

DevOps automation service Chef today announced a number of new integrations with Microsoft Azure. The news, which was announced at the Microsoft Ignite conference in Orlando, Florida, focuses on helping enterprises bring their legacy applications to Azure and ranges from the public preview of Chef Automate Managed Service for Azure to the integration of Chef’s InSpec compliance product with Microsoft’s cloud platform.

With Chef Automate as a managed service on Azure, which provides ops teams with a single tool for managing and monitoring their compliance and infrastructure configurations, developers can now easily deploy and manage Chef Automate and the Chef Server from the Azure Portal. It’s a fully managed service, and the company promises that businesses can get started with it in as little as thirty minutes (though I’d take those numbers with a grain of salt).

When those configurations need to change, Chef users on Azure can also now use the Chef Workstation with Azure Cloud Shell, Azure’s command line interface. Workstation is one of Chef’s newest products and focuses on making ad-hoc configuration changes, no matter whether the node is managed by Chef or not.

And to remain in compliance, Chef is also launching an integration of its InSpec security and compliance tools with Azure. InSpec works hand in hand with Microsoft’s new Azure Policy Guest Configuration (who comes up with these names?) and allows users to automatically audit all of their applications on Azure.
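InSpec profiles themselves are written in Ruby, but a scan is easy to automate from anywhere. As an illustrative sketch, this drives the InSpec CLI against an Azure subscription and reads the JSON report; the profile name is a placeholder:

```python
# Sketch: run an InSpec compliance scan against Azure and summarize results.
# "my-azure-profile" is a placeholder profile; `-t azure://` targets the
# subscription configured in the local Azure credentials.
import json
import subprocess

result = subprocess.run(
    ["inspec", "exec", "my-azure-profile", "-t", "azure://", "--reporter", "json"],
    capture_output=True, text=True,
)
report = json.loads(result.stdout)
for profile in report.get("profiles", []):
    for control in profile.get("controls", []):
        statuses = [r["status"] for r in control.get("results", [])]
        print(control["id"], statuses)
```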

“Chef gives companies the tools they need to confidently migrate to Microsoft Azure so users don’t just move their problems when migrating to the cloud, but have an understanding of the state of their assets before the migration occurs,” said Corey Scobie, the senior vice president of products and engineering at Chef, in today’s announcement. “Being able to detect and correct configuration and security issues to ensure success after migrations gives our customers the power to migrate at the right pace for their organization.”


Sep
24
2018
--

Microsoft Azure gets new high-performance storage options

Microsoft Azure is getting a number of new storage options today that mostly focus on use cases where disk performance matters.

The first of these is Azure Ultra SSD Managed Disks, which are now in public preview. Microsoft says that these drives will offer “sub-millisecond latency,” which unsurprisingly makes them ideal for workloads where latency matters.

Earlier this year, Microsoft launched its Premium and Standard SSD Managed Disks offerings for Azure into preview. These ‘ultra’ SSDs represent the next tier up from the Premium SSDs, with even lower latency and higher throughput. They’ll offer up to 160,000 IOPS with less than a millisecond of read/write latency, and the disks will come in sizes ranging from 4GB to 64TB.
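For context, provisioning such a disk looks much like any other managed disk, just with the Ultra SKU. A hedged sketch with the azure-mgmt-compute SDK; the subscription, resource group and region are placeholders, and the per-disk IOPS/throughput knobs are omitted:

```python
# Sketch: creating an Ultra SSD managed disk. Values are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

compute = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")

poller = compute.disks.begin_create_or_update(
    "my-resource-group",
    "ultra-disk-0",
    {
        "location": "eastus2",
        "zones": ["1"],  # Ultra disks are zonal
        "sku": {"name": "UltraSSD_LRS"},  # the Ultra tier SKU
        "disk_size_gb": 1024,
        "creation_data": {"create_option": "Empty"},
    },
)
print(poller.result().provisioning_state)
```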

Speaking of Standard SSD Managed Disks: that service is now generally available after only three months in preview. To top things off, all of Azure’s storage tiers (Premium and Standard SSD, as well as Standard HDD) now offer 8, 16 and 32 TB capacity options.

Also new today is Azure Premium Files, which is now in preview. This, too, is an SSD-based service. Azure Files itself isn’t new, though. It offers users access to cloud storage using the standard SMB protocol. The new premium offering promises higher throughput and lower latency for these kinds of SMB operations.


Sep
24
2018
--

Microsoft wants to put your data in a box

AWS has its Snowball (and Snowmobile truck), Google Cloud has its data transfer appliance and Microsoft has its Azure Data Box. All of these are physical appliances that let enterprises move lots of data to the cloud by loading it onto the machines and then shipping them back to the provider. Microsoft’s Azure Data Box launched into preview about a year ago, and today the company is announcing a number of updates and adding a few new boxes, too.

First of all, the standard 50-pound, 100-terabyte Data Box is now generally available. If you’ve got a lot of data to transfer to the cloud — or maybe collect a lot of offline data — then FedEx will happily pick this one up and Microsoft will upload the data to Azure and charge you for your storage allotment.

If you’ve got a lot more data, though, then Microsoft now also offers the Azure Data Box Heavy. This new box, which is now in preview, can hold up to one petabyte of data. Microsoft did not say how heavy the Data Box Heavy is, though.

Also new is the Azure Data Box Edge, which is now also in preview. In many ways, this is the most interesting of the additions since it goes well beyond transporting data. As the name implies, Data Box Edge is meant for edge deployments where a company collects data. What makes this version stand out is that it’s basically a small data center rack that lets you process data as it comes in. It even includes an FPGA to run AI algorithms at the edge.

Using this box, enterprises can collect the data, transform and analyze it on the box, and then send it to Azure over the network (and not in a truck). Using this, users can cut back on bandwidth cost and don’t have to send all of their data to the cloud for processing.
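The pattern is independent of the specific box: reduce at the edge, ship only the result. A generic sketch of that idea (this is ordinary Azure Blob Storage upload code, not the Data Box Edge API; the connection string and names are placeholders):

```python
# Sketch: aggregate telemetry locally, upload only the summary to Azure.
import json
import random
from statistics import mean
from azure.storage.blob import BlobClient

# Stand-in for raw samples collected at the edge.
readings = [random.gauss(20.0, 2.0) for _ in range(100_000)]

# Reduce locally: only the aggregate leaves the site, not 100k raw samples.
summary = {"count": len(readings), "avg": mean(readings), "max": max(readings)}

blob = BlobClient.from_connection_string(
    "<connection-string>",  # placeholder
    container_name="telemetry",
    blob_name="site-42/summary.json",
)
blob.upload_blob(json.dumps(summary), overwrite=True)
```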

Also part of the same Data Box family is the Data Box Gateway. This one is a virtual appliance, however, that runs on Hyper-V and VMware and lets users create a data transfer gateway for importing data into Azure. That’s not quite as interesting as a hardware appliance, but useful nonetheless.


Aug
13
2018
--

Microsoft stands up Azure Stack for government as JEDI contract looms

Microsoft announced today that it’s released Azure Stack for Azure Government at a time when it’s battling rivals at Amazon and other cloud companies for the massive winner-take-all $10 billion Pentagon cloud contract known as JEDI.

Azure Stack provides customers with a similar set of cloud services that they would get in the public cloud, but inside the cozy confines of the customer data center. For Azure cloud customers who are looking to manage across public and private environments, often referred to as a hybrid approach, it gives a common look and feel across both public and private.

“As a cornerstone of Microsoft’s hybrid cloud approach, consistency means government customers get the same infrastructure and services with Azure Stack as they do with Azure — the same APIs, DevOps tools, portal, and more,” Natalia Mackevicius, program director for Microsoft Azure Stack, wrote in a blog post announcing the new program.
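That consistency is visible at the SDK level, too. As a hedged sketch, the same Azure management client can target either public Azure or an Azure Stack deployment simply by switching the ARM endpoint; the Stack URL is a placeholder, and details like the AAD authority and API profile are omitted:

```python
# Sketch: identical management calls against public Azure and Azure Stack.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

credential = DefaultAzureCredential()

public = ResourceManagementClient(credential, "<subscription-id>")
stack = ResourceManagementClient(
    credential,
    "<subscription-id>",
    base_url="https://management.local.azurestack.external",  # placeholder
)

for client in (public, stack):
    # Same client type, same call, different environment.
    for group in client.resource_groups.list():
        print(group.name)
```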

In addition, the company announced it had passed a third-party FedRAMP audit. FedRAMP is a government program that provides a standardized way for government procurement officials to assess cloud security.

“Azure Stack for Azure Government directly addresses many other significant challenges our top federal government customers face. This includes tough regulatory, connectivity and latency requirements,” Mackevicius wrote.

While this product is geared toward any government customer, the news could certainly be appealing to the Pentagon, which is looking for one vendor to rule them all in its latest mega cloud RFP. While Microsoft wouldn’t comment on JEDI specifically because it’s in the midst of answering that RFP, the timing can’t be a coincidence.

Microsoft, along with other competitors including Oracle and IBM, has been complaining bitterly that the one-vendor contract process unfairly favors Amazon. These companies have recommended that the Pentagon go with a multi-vendor approach to prevent lock-in and take advantage of innovation across sellers. The complaints have so far fallen on deaf ears at the Pentagon.

Regardless, Microsoft is still battling hard for the massive contract, and today’s release certainly bolsters its approach as it continues to fight to win the JEDI deal — and other government business.

Jul
26
2018
--

Amazon’s AWS continues to lead its performance highlights

Amazon Web Services continues to be the highlight of the company’s earnings, once again showing in the second quarter the kind of growth Amazon is looking for in a newer business — especially one that has dramatically better margins than its core retail business.

Despite now running a grocery chain, the company’s AWS division — which has an operating margin over 25 percent, compared to the tiny margins on retail — grew 49 percent year-over-year in the quarter. It’s also up 49 percent when comparing the most recent six months to the same period last year. AWS is now on a run rate well north of $10 billion annually, generating more than $6 billion in revenue in the second quarter this year. Meanwhile, Amazon’s retail operations generated nearly $47 billion in revenue with a net income of just over $1.3 billion (unaudited). AWS generated $1.6 billion in operating income on its $6.1 billion in revenue.
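Those reported figures pencil out to the margin gap described above:

```python
# Quick check of the margin math, using the figures as reported.
aws_revenue = 6.1e9
aws_operating_income = 1.6e9
retail_revenue = 47e9
retail_net_income = 1.3e9

print(f"AWS operating margin: {aws_operating_income / aws_revenue:.1%}")  # ~26.2%
print(f"Retail net margin:    {retail_net_income / retail_revenue:.1%}")  # ~2.8%
```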

So, in short, Amazon’s dramatically more efficient AWS business is the biggest contributor to its actual net income. The company reported earnings of $5.07 per share, compared to analyst estimates of around $2.50 per share, on revenue of $52.9 billion. That revenue number fell short of what investors were looking for, so the stock isn’t really doing much in after-hours trading, and Amazon remains in the race to become a company with a market cap of $1 trillion, alongside Google, Apple and Microsoft.

This isn’t extremely surprising, as Amazon was one of the original harbingers of the move to a cloud computing-focused world and, as a result, Microsoft and Google are now chasing it to capture as much share as possible. While Microsoft doesn’t break out Azure, the company says it’s one of its fastest-growing businesses, and Google’s “other revenue” segment, which includes Google Cloud Platform, also continues to be one of its fastest-growing divisions. Running a bunch of servers that sell on-demand compute, it turns out, is a pretty efficient business, one that helps make up for the very slim margins Amazon has on the rest of its core business.
