Mar
20
2019
--

Windows Virtual Desktop is now in public preview

Last year, Microsoft announced its Windows Virtual Desktop service. At the time, this was a private preview, but starting today, any enterprise user who wants to see what a virtual Windows 10 desktop hosted in the Azure cloud looks like can give it a try.

It’s worth noting that this is very much a product for businesses. You’re not going to use this to play Apex Legends on a virtual machine somewhere in the cloud. The idea here is that a service like this, which also includes access to Office 365 ProPlus, makes managing machines and the software that runs on them easier for enterprises. It also allows employers in regulated industries to provide their mobile workers with a virtual desktop that ensures that all of their precious data remains secure.

One stand-out feature here is that businesses can run multiple Windows 10 sessions on a single virtual machine.

It’s also worth noting that many of the features of this service are powered by technology from FSLogix, which Microsoft acquired last year. Specifically, these technologies allow Microsoft to give users of non-persistent desktops relatively fast access to applications like Outlook and OneDrive, for example.

For most Microsoft 365 enterprise customers, access to this service is simply part of the subscription cost they already pay — though they will need an Azure subscription and to pay for the virtual machines that run in the cloud.

Right now, the service is only available in the US East 2 and US Central Azure regions. Over time, and once the preview is over, Microsoft will expand it to all of its cloud regions.

Mar
20
2019
--

Portworx raises $27M Series C for its cloud-native data management platform

As enterprises adopt cloud-native technologies like containers to build their applications, the next question they often have to ask themselves is how they adapt their data storage and management practices to this new reality, too. One of the companies in this business is the four-year-old Portworx, which has managed to attract customers like Lufthansa Systems, GE Digital and HPE with its cloud-native storage and data-management platform for the Kubernetes container orchestration platform.

Portworx today announced that it has raised a $27 million Series C funding round led by Sapphire Ventures and the ventures arm of Abu Dhabi’s Mubadala Investment Company. Existing investors Mayfield Fund and GE Ventures also participated, as did new investors Cisco, HPE and NetApp, which clearly have a strategic interest in bringing Portworx’s storage offering to their own customers and in partnering with the company.

Portworx’s tools make it easier for developers to migrate data, create backups and recover them after an issue. The service supports most popular databases, including Cassandra, Redis and MySQL, but also other storage services. Essentially, it creates a storage layer for database containers or other stateful containers that your apps can then access, no matter where they run or where the data resides.
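To make that concrete, here is a minimal sketch, using the official Kubernetes Python client, of how an application might request Portworx-backed storage through a PersistentVolumeClaim. The StorageClass name is an assumption for illustration; an admin would have created one with the Portworx provisioner beforehand.

```python
# Sketch: request Portworx-backed storage for a stateful container
# via the Kubernetes Python client (pip install kubernetes).
# "portworx-sc" is an assumed StorageClass name, not an official default.
from kubernetes import client, config

config.load_kube_config()  # uses your local kubeconfig

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="cassandra-data"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        storage_class_name="portworx-sc",  # assumed Portworx-backed class
        resources=client.V1ResourceRequirements(
            requests={"storage": "10Gi"}
        ),
    ),
)

client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="default", body=pvc
)
```

The database container then mounts that claim declaratively, and Portworx worries about where the data actually lives, which is what lets the app run anywhere.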

“As the cloud-native stack matures, Portworx’s leadership in the data layer is really what is highlighted by our funding,” Portworx CEO and co-founder Murli Thirumale told me. “We clearly have a significant number of customers, there is a lot of customer growth, our partner network is growing. What you are seeing is that within that cloud-native ecosystem, we have the maximum number of production deployments and that momentum is something we’re continuing to fuel and fund with this round.”

Thirumale also told me that the company expanded its customer base by over 100 percent last year and increased its total bookings by 376 percent year-over-year. That’s obviously the kind of growth that investors want to see. Thirumale noted, though, that the company wasn’t under any pressure to raise at this point. “We were seeing such strong growth momentum that we knew we need the money to fuel the growth.” That means expanding the company’s sales force, especially internationally, as well as its support team to help its new customers manage their data lifecycle.

In addition to today’s funding round, Portworx today also announced the latest version of its flagship Portworx Enterprise platform, which now includes new data security and disaster recovery functions. These include improved role-based access controls that go beyond what Kubernetes traditionally offers (and that integrate with existing enterprise systems). The new disaster recovery tools allow enterprises to make incremental backups to data centers in different geographic locations. Maybe more importantly, Portworx now also lets users automatically save data to two nearby data center zones as updates happen. That’s meant to enable use cases where no data loss would be acceptable in the case of an outage. With this, a company could automatically back up data from a database that sits in Azure Germany Central to AWS Europe Frankfurt, for example.

Feb
20
2019
--

Google’s hybrid cloud platform is now in beta

Last July, at its Cloud Next conference, Google announced the Cloud Services Platform, its first real foray into bringing its own cloud services into the enterprise data center as a managed service. Today, the Cloud Services Platform (CSP) is launching into beta.

It’s important to note that the CSP isn’t — at least for the time being — Google’s way of bringing all of its cloud-based developer services to the on-premises data center. In other words, this is a very different project from something like Microsoft’s Azure Stack. Instead, the focus is on the Google Kubernetes Engine, which allows enterprises to run their applications in both their own data centers and on virtually any cloud platform that supports containers.

As Google Cloud engineering director Chen Goldberg told me, the idea here is to help enterprises innovate and modernize. “Clearly, everybody is very excited about cloud computing, on-demand compute and managed services, but customers have recognized that the move is not that easy,” she said, noting that the vast majority of enterprises are adopting a hybrid approach. And while containers are obviously still a very new technology, she feels good about this bet on the technology because most enterprises are already adopting containers and Kubernetes — and they are doing so at exactly the same time as they are adopting cloud, and especially hybrid clouds.

CSP, it’s worth stressing, is a managed platform: Google handles all of the heavy lifting like upgrades and security patches. And for enterprises that need an easy way to install some of the most popular applications, the platform also supports Kubernetes applications from the GCP Marketplace.

As for the tech itself, Goldberg stressed that this isn’t just about Kubernetes. The service also uses Istio, for example, the increasingly popular service mesh that makes it easier for enterprises to secure and control the flow of traffic and API calls between their applications.
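For a sense of what that kind of traffic control looks like, here is a generic Istio traffic-splitting sketch (not anything CSP-specific) that routes 90 percent of calls to one version of a service and 10 percent to a canary. The service names, subsets and namespace are assumptions for illustration.

```python
# Generic Istio VirtualService sketch, applied through the Kubernetes
# Python client. Names and namespace are illustrative assumptions.
from kubernetes import client, config

config.load_kube_config()
api = client.CustomObjectsApi()

virtual_service = {
    "apiVersion": "networking.istio.io/v1alpha3",
    "kind": "VirtualService",
    "metadata": {"name": "reviews"},
    "spec": {
        "hosts": ["reviews"],
        "http": [{
            "route": [
                # Send 90% of traffic to v1 and 10% to the v2 canary.
                {"destination": {"host": "reviews", "subset": "v1"}, "weight": 90},
                {"destination": {"host": "reviews", "subset": "v2"}, "weight": 10},
            ]
        }],
    },
}

api.create_namespaced_custom_object(
    group="networking.istio.io",
    version="v1alpha3",
    namespace="default",
    plural="virtualservices",
    body=virtual_service,
)
```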

With today’s release, Google is also launching its new CSP Config Management tool to help users create multi-cluster policies and set up and enforce access controls, resource quotas and more. CSP also integrates with Google’s Stackdriver Monitoring service and continuous delivery platforms.

“On-prem is not easy,” Goldberg said, and given that this is the first time the company is really supporting software in a data center that is not its own, that’s probably an understatement. But Google also decided that it didn’t want to force users into a specific set of hardware specifications like Azure Stack does, for example. Instead, CSP sits on top of VMware’s vSphere server virtualization platform, which most enterprises already use in their data centers anyway. That surely simplifies things, given that this is a very well-understood platform.

Feb
20
2019
--

Why Daimler moved its big data platform to the cloud

Like virtually every big enterprise company, a few years ago, the German auto giant Daimler decided to invest in its own on-premises data centers. And while those aren’t going away anytime soon, the company today announced that it has successfully moved its on-premises big data platform to Microsoft’s Azure cloud. This new platform, which the company calls eXtollo, is Daimler’s first major service to run outside of its own data centers, though it’ll probably not be the last.

As Guido Vetter, the head of Daimler’s corporate center of excellence for advanced analytics and big data, told me, the company started getting interested in big data about five years ago. “We invested in technology — the classical way, on-premise — and got a couple of people on it. And we were investigating what we could do with data because data is transforming our whole business as well,” he said.

By 2016, the size of the organization had grown to the point where a more formal structure was needed to enable the company to handle its data at a global scale. At the time, the buzz phrase was “data lakes” and the company started building its own in order to build out its analytics capacities.

Electric lineup, Daimler AG

“Sooner or later, we hit the limits as it’s not our core business to run these big environments,” Vetter said. “Flexibility and scalability are what you need for AI and advanced analytics and our whole operations are not set up for that. Our backend operations are set up for keeping a plant running and keeping everything safe and secure.” But in this new world of enterprise IT, companies need to be able to be flexible and experiment — and, if necessary, throw out failed experiments quickly.

So about a year and a half ago, Vetter’s team started the eXtollo project to bring all the company’s activities around advanced analytics, big data and artificial intelligence into the Azure Cloud, and just over two weeks ago, the team shut down its last on-premises servers after slowly turning on its solutions in Microsoft’s data centers in Europe, the U.S. and Asia. All in all, the actual transition between the on-premises data centers and the Azure cloud took about nine months. That may not seem fast, but for an enterprise project like this, that’s about as fast as it gets (and for a while, it fed all new data into both its on-premises data lake and Azure).

If you work for a startup, then all of this probably doesn’t seem like a big deal, but for a more traditional enterprise like Daimler, even just giving up control over the physical hardware where your data resides was a major culture change and something that took quite a bit of convincing. In the end, the solution came down to encryption.

“We needed the means to secure the data in the Microsoft data center with our own means that ensure that only we have access to the raw data and work with the data,” explained Vetter. In the end, the company decided to use the Azure Key Vault to manage and rotate its encryption keys. Indeed, Vetter noted that knowing that the company had full control over its own data was what allowed this project to move forward.
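For a sense of what that looks like in practice, here is a minimal sketch using Microsoft’s azure-identity and azure-keyvault-keys Python packages. The vault and key names are hypothetical, and this is of course not Daimler’s actual setup.

```python
# Sketch: customer-managed encryption keys with Azure Key Vault.
# pip install azure-identity azure-keyvault-keys
# Vault URL and key name are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.keyvault.keys import KeyClient
from azure.keyvault.keys.crypto import CryptographyClient, EncryptionAlgorithm

credential = DefaultAzureCredential()
key_client = KeyClient(
    vault_url="https://example-vault.vault.azure.net",  # hypothetical vault
    credential=credential,
)

# Create an RSA key that only the vault's owner controls.
key = key_client.create_rsa_key("datalake-cmk", size=2048)

# Encrypt against that key; the private key never leaves the vault.
crypto = CryptographyClient(key, credential=credential)
result = crypto.encrypt(EncryptionAlgorithm.rsa_oaep, b"sensitive telemetry")
print(result.ciphertext[:16])
```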

Vetter told me the company obviously looked at Microsoft’s competitors as well, but he noted that his team didn’t find a compelling offer from other vendors with the functionality and security features it needed.

Today, Daimler’s big data unit uses tools like Azure HDInsight and Azure Databricks, which cover more than 90 percent of the company’s current use cases. In the future, Vetter also wants to make it easier for less experienced users to use self-service tools to launch AI and analytics services.

While cost is often a factor that counts against the cloud, because renting server capacity isn’t cheap, Vetter argues that this move will actually save the company money and that storage costs, especially, are going to be cheaper in the cloud than in its on-premises data center (and chances are that Daimler, given its size and prestige as a customer, isn’t exactly paying the same rack rate that others are paying for the Azure services).

As with so many big data and AI projects, predictions are the focus of much of what Daimler is doing. That may mean looking at a car’s data and error codes to help a technician diagnose an issue, or doing predictive maintenance on a commercial vehicle. Interestingly, the company isn’t currently bringing any of the IoT data from its plants to the cloud. That’s all managed in the company’s on-premises data centers, because it wants to avoid the risk of having to shut down a plant because its tools lost the connection to a data center, for example.

Feb
14
2019
--

Peltarion raises $20M for its AI platform

Peltarion, a Swedish startup founded by former execs from companies like Spotify, Skype, King, TrueCaller and Google, today announced that it has raised a $20 million Series A funding round led by Euclidean Capital, the family office for hedge fund billionaire James Simons. Previous investors FAM and EQT Ventures also participated, and this round brings the company’s total funding to $35 million.

There is obviously no dearth of AI platforms these days. Peltarion focuses on what it calls “operational AI.” The service offers an end-to-end platform that lets you do everything from pre-processing your data to building models and putting them into production. All of this runs in the cloud, and developers get access to a graphical user interface for building and testing their models. This setup, the company stresses, ensures that Peltarion’s users don’t have to deal with any of the low-level hardware or software and can instead focus on building their models.

“The speed at which AI systems can be built and deployed on the operational platform is orders of magnitude faster compared to the industry standard tools such as TensorFlow and require far fewer people and decreases the level of technical expertise needed,” Luka Crnkovic-Friis, Peltarion’s CEO and co-founder, tells me. “All this results in more organizations being able to operationalize AI and focusing on solving problems and creating change.”

In a world where businesses have a plethora of choices, though, why use Peltarion over more established players? “Almost all of our clients are worried about lock-in to any single cloud provider,” Crnkovic-Friis said. “They tend to be fine using storage and compute as they are relatively similar across all the providers and moving to another cloud provider is possible. Equally, they are very wary of the higher-level services that AWS, GCP, Azure, and others provide as it means a complete lock-in.”

Peltarion, of course, argues that its platform doesn’t lock in its users and that other platforms take far more AI expertise to produce commercially viable AI services. The company rightly notes that, outside of the tech giants, most companies still struggle with how to use AI at scale. “They are stuck on the starting blocks, held back by two primary barriers to progress: immature patchwork technology and skills shortage,” said Crnkovic-Friis.

The company will use the new funding to expand its development team and its teams working with its community and partners. It’ll also use the new funding for growth initiatives in the U.S. and other markets.

Feb
07
2019
--

Microsoft Azure sets its sights on more analytics workloads

Enterprises now amass huge amounts of data, both from their own tools and applications, as well as from the SaaS applications they use. For a long time, that data was basically exhaust. Maybe it was stored for a while to fulfill some legal requirements, but then it was discarded. Now, data is what drives machine learning models, and the more data you have, the better. It’s maybe no surprise, then, that the big cloud vendors started investing in data warehouses and lakes early on. But that’s just a first step. After that, you also need the analytics tools to make all of this data useful.

Today, it’s Microsoft’s turn to shine the spotlight on its data analytics services. The actual news here is pretty straightforward. Two services are moving into general availability: the second generation of Azure Data Lake Storage, for big data analytics workloads, and Azure Data Explorer, a managed service that makes ad-hoc analysis of massive data volumes easier. Microsoft is also previewing a new feature in Azure Data Factory, its graphical no-code service for building data transformations: Data Factory now features the ability to map data flows.
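To give a sense of the ad-hoc analysis piece, here is a small sketch of querying Azure Data Explorer with the azure-kusto-data Python package. The cluster URL, database and table names are assumptions for illustration.

```python
# Sketch: ad-hoc query against Azure Data Explorer (Kusto).
# pip install azure-kusto-data
# Cluster URL, database and table names are hypothetical.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

kcsb = KustoConnectionStringBuilder.with_aad_device_authentication(
    "https://examplecluster.westus2.kusto.windows.net"  # hypothetical cluster
)
client = KustoClient(kcsb)

# Count events per day over the last week.
query = """
Events
| where Timestamp > ago(7d)
| summarize count() by bin(Timestamp, 1d)
"""
response = client.execute("TelemetryDB", query)
for row in response.primary_results[0]:
    print(row)
```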

Those individual news pieces are interesting if you are a user or are considering Azure for your big data workloads, but what’s maybe more important here is that Microsoft is trying to offer a comprehensive set of tools for managing and storing this data — and then using it for building analytics and AI services.

(Photo credit: Josh Edelson/AFP/Getty Images)

“AI is a top priority for every company around the globe,” Julia White, Microsoft’s corporate VP for Azure, told me. “And as we are working with our customers on AI, it becomes clear that their analytics often aren’t good enough for building an AI platform.” These companies are generating plenty of data, which then has to be pulled into analytics systems. She stressed that she couldn’t remember a customer conversation in recent months that didn’t focus on AI. “There is urgency to get to the AI dream,” White said, but the growth and variety of data presents a major challenge for many enterprises. “They thought this was a technology that was separate from their core systems. Now it’s expected for both customer-facing and line-of-business applications.”

Data Lake Storage helps with managing this variety of data since it can handle both structured and unstructured data (and is optimized for the Spark and Hadoop analytics engines). The service can ingest any kind of data — yet Microsoft still promises that it will be very fast. “The world of analytics tended to be defined by having to decide upfront and then building rigid structures around it to get the performance you wanted,” explained White. Data Lake Storage, on the other hand, wants to offer the best of both worlds.

Likewise, White argued that while many enterprises used to keep these services on their own servers, many of those on-premises deployments are still appliance-based. But she believes the cloud has now reached the point where the price/performance calculations are in its favor. It took a while to get to this point, though, and to convince enterprises. White noted that for the longest time, enterprises saw analytics projects as $300 million efforts that took forever, tied up lots of people and were frankly a bit scary. “But also, what we had to offer in the cloud hasn’t been amazing until some of the recent work,” she said. “We’ve been on a journey — as well as the other cloud vendors — and the price performance is now compelling.” And it sure helps that if enterprises want to meet their AI goals, they’ll now have to tackle these workloads, too.

Feb
05
2019
--

Google’s still not sharing cloud revenue

Google has shared its cloud revenue exactly once over the last several years, and silence tends to invite speculation to fill the information vacuum. Luckily, some analyst firms try to fill the void, and it looks like Google’s cloud business is actually trending in the right direction, even if the company isn’t willing to tell us an exact number.

When Google last reported its cloud revenue, about this time last year, it indicated that it had earned $1 billion in revenue for the quarter, a figure that included Google Cloud Platform and G Suite combined. Diane Greene, who was head of Google Cloud at the time, called it an “elite business,” but in reality it was pretty small potatoes compared with Microsoft’s and Amazon’s cloud numbers; the two were pulling in $4-$5 billion a quarter between them at the time. Google was looking at a $4 billion run rate for the entire year.

Google apparently didn’t like the reaction it got from that disclosure so it stopped talking about cloud revenue. Yesterday when Google’s parent company, Alphabet, issued its quarterly earnings report, to nobody’s surprise, it failed to report cloud revenue yet again, at least not directly.

Google CEO Sundar Pichai gave some hints, but never revealed an exact number. Instead he talked in vague terms calling Google Cloud “a fast-growing multibillion-dollar business.” The only time he came close to talking about actual revenue was when he said, “Last year, we more than doubled both the number of Google Cloud Platform deals over $1 million as well as the number of multiyear contracts signed. We also ended the year with another milestone, passing 5 million paying customers for our cloud collaboration and productivity solution, G Suite.”

OK, it’s not an actual dollar figure, but it gives a sense that the company is actually moving the needle in the cloud business. A bit later in the call, CFO Ruth Porat threw in this cloud revenue nugget: “We are also seeing a really nice uptick in the number of deals that are greater than $100 million and really pleased with the success and penetration there. At this point, not updating further.” She is not updating further. Got it.

That brings us to a company that guessed for us, Canalys. While the firm didn’t share its methodology, it did come up with a figure of $2.2 billion for the quarter. Given that the company is closing larger deals and was at a billion last year, this figure feels like it’s probably in the right ballpark, but of course it’s not from the horse’s mouth, so we can’t know for certain. It’s worth noting that Canalys told TechCrunch that this is for GCP revenue only, and does not include G Suite, so that would suggest that it could be gaining some momentum.

Frankly, I’m a little baffled that Alphabet’s shareholders let the company get away with this complete lack of transparency. You would think they would want to know exactly what the company is making on that crucial part of the business. As a cloud market watcher, I know I would, and if the company is truly beginning to pick up steam, as the Canalys data suggests, the lack of openness is even more surprising. Maybe next quarter.

Feb
05
2019
--

BetterCloud can now manage any SaaS application

BetterCloud began life as a way to provide an operations layer for G Suite. More recently, after a platform overhaul, it began layering on a handful of other SaaS applications. Today, the company announced, it is now possible to add any SaaS application to its operations dashboard and monitor usage across applications via an API.

As founder and CEO David Politis explains, a tool like Okta provides a way to authenticate your SaaS app, but once an employee starts using it, BetterCloud gives you visibility into how it’s being used.

“The first order problem was identity, the access, the connections. What we’re doing is we’re solving the second order problem, which is the interactions,” Politis explained. In his view, companies lack the ability to monitor and understand the interactions going on across SaaS applications, as people interact and share information, inside and outside the organization. BetterCloud has been designed to give IT control and security over what is occurring in their environment, he explained.

He says they can provide as much or as little control as a company needs, and they can set controls by application or across a number of applications without actually changing the user’s experience. They do this through a scripting library. BetterCloud comes with a number of scripts and provides log access to give visibility into the scripting activity.

If a customer is looking to use this data more effectively, the solution includes a Graph API for ingesting data and seeing the connections across the data that BetterCloud is collecting. Customers can also set event triggers or actions based on the data being collected as certain conditions are met.
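As a purely hypothetical sketch of the pattern Politis describes (the endpoint, query shape and field names below are invented for illustration; consult BetterCloud’s own documentation for the real API), querying a graph of SaaS interactions might look something like this:

```python
# Hypothetical sketch of querying a GraphQL-style API for SaaS
# interaction data. The URL, schema and token are NOT BetterCloud's
# real API; they are placeholders to illustrate the pattern.
import requests

query = """
{
  fileEvents(app: "G Suite", sharedExternally: true, last: 10) {
    user
    fileName
    sharedWith
  }
}
"""
resp = requests.post(
    "https://api.bettercloud.example/graph",  # placeholder endpoint
    json={"query": query},
    headers={"Authorization": "Bearer YOUR_API_TOKEN"},
    timeout=30,
)
resp.raise_for_status()
for event in resp.json()["data"]["fileEvents"]:
    print(event["user"], "shared", event["fileName"], "with", event["sharedWith"])
```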

All of this is possible because the company overhauled the platform last year to allow BetterCloud to move beyond G Suite and plug other SaaS applications into it. Today’s announcement is the ultimate manifestation of that capability. Instead of BetterCloud building the connectors, it’s providing an API to let its customers do it.

The company was founded in 2011 and has raised more than $106 million, according to Crunchbase.

Jan
24
2019
--

Microsoft acquires Citus Data

Microsoft today announced that it has acquired Citus Data, a company that focused on making PostgreSQL databases faster and more scalable. Citus’ open-source PostgreSQL extension essentially turns the application into a distributed database and, while there has been a lot of hype around the NoSQL movement and document stores, relational databases — and especially PostgreSQL — are still a growing market, in part because of tools from companies like Citus that overcome some of their earlier limitations.
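The core primitive here is refreshingly simple: Citus ships a create_distributed_table() function that shards an ordinary PostgreSQL table across worker nodes. Here is a minimal sketch using psycopg2, with placeholder connection details and an assumed multi-tenant schema.

```python
# Sketch: turning a regular PostgreSQL table into a Citus distributed
# table. Connection details and schema are placeholders;
# create_distributed_table() is Citus's actual API.
import psycopg2

conn = psycopg2.connect("host=coordinator dbname=app user=postgres")
conn.autocommit = True
cur = conn.cursor()

cur.execute("CREATE EXTENSION IF NOT EXISTS citus;")
cur.execute("""
    CREATE TABLE events (
        tenant_id bigint NOT NULL,
        event_id  bigserial,
        payload   jsonb
    );
""")
# Shard the table across the cluster by tenant_id; queries that filter
# on the distribution column are routed to a single shard.
cur.execute("SELECT create_distributed_table('events', 'tenant_id');")
```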

Unsurprisingly, Microsoft plans to work with the Citus Data team to “accelerate the delivery of key, enterprise-ready features from Azure to PostgreSQL and enable critical PostgreSQL workloads to run on Azure with confidence.” The Citus co-founders echo this in their own statement, noting that “as part of Microsoft, we will stay focused on building an amazing database on top of PostgreSQL that gives our users the game-changing scale, performance, and resilience they need. We will continue to drive innovation in this space.”

PostgreSQL is obviously an open-source tool, and while the fact that Microsoft is now a major open-source contributor doesn’t come as a surprise anymore, it’s worth noting that the company stresses that it will continue to work with the PostgreSQL community. In an email, a Microsoft spokesperson also noted that “the acquisition is a proof point in the company’s commitment to open source and accelerating Azure PostgreSQL performance and scale.”

Current Citus customers include the likes of real-time analytics service Chartbeat, email security service Agari and PushOwl, though the company notes that it also counts a number of Fortune 100 companies among its users (they tend to stay anonymous). The company offers a database as a service, an on-premises enterprise version and a free open-source edition. For the time being, it seems like that’s not changing, though over time I would suspect that Microsoft will transition users of the hosted service to Azure.

The price of the acquisition was not disclosed. Citus Data, which was founded in 2010 and graduated from the Y Combinator program, previously raised more than $13 million from the likes of Khosla Ventures, SV Angel and Data Collective.

Jan
23
2019
--

AWS launches WorkLink to make accessing mobile intranet sites and web apps easier

If your company uses a VPN and/or a mobile device management service to give you access to its intranet and internal web apps, then you know how annoying those are. AWS today launched a new product, Amazon WorkLink, that promises to make this process significantly easier.

WorkLink is a fully managed service that, for $5 per user per month, allows IT admins to give employees one-click access to internal sites, no matter whether they run on AWS or not.

After installing WorkLink on their phones, employees can then simply use their favorite browser to surf to an internal website (other solutions often force users to use a sub-par proprietary browser). WorkLink then goes to work and securely requests that site. That’s where the smart part happens: a secure WorkLink container converts the site into an interactive vector graphic and sends it back to the phone. Nothing is stored or cached on the phone, and AWS says WorkLink knows nothing about personal device activity, either. That also means that when a device is lost or stolen, there’s no need to try to wipe it remotely, because there’s simply no company data on it.

IT can either use a VPN to connect from an AWS Virtual Private Cloud to on-premises servers or use AWS Direct Connect to bypass a VPN solution altogether. The service works with all SAML 2.0 identity providers (which covers the majority of identity services used in the enterprise, including the likes of Okta and Ping Identity), and, as a fully managed service, it handles scaling and updates in the background.
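For admins curious about the setup, here is a rough boto3 sketch of creating a fleet and pointing it at an internal domain. The names and the certificate ARN are placeholders, and the exact parameters are worth verifying against the WorkLink API reference.

```python
# Sketch: standing up an Amazon WorkLink fleet with boto3.
# Fleet name, domain and ACM certificate ARN are placeholders.
import boto3

worklink = boto3.client("worklink", region_name="us-east-1")

fleet = worklink.create_fleet(
    FleetName="corp-intranet",
    DisplayName="Corporate intranet",
    OptimizeForEndUserLocation=True,
)

# Associate an internal domain, fronted by an ACM certificate.
worklink.associate_domain(
    FleetArn=fleet["FleetArn"],
    DomainName="intranet.example.com",
    AcmCertificateArn="arn:aws:acm:us-east-1:123456789012:certificate/example",
)
```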

“When talking with customers, all of them expressed frustration that their workers don’t have an easy and secure way to access internal content, which means that their employees either waste time or don’t bother trying to access content that would make them more productive,” says Peter Hill, vice president of Productivity Applications at AWS, in today’s announcement. “With Amazon WorkLink, we’re enabling greater workplace productivity for those outside the corporate firewall in a way that IT administrators and security teams are happy with and employees are willing to use.”

WorkLink will work with both Android and iOS, but for the time being, only the iOS app (iOS 12+) is available. For now, it also only works with Safari, with Chrome support coming in the next few weeks. The service is also only available in Europe and North America, with additional regions coming later this year.

For the time being, AWS’s cloud archrivals Google and Microsoft don’t offer any services that are quite comparable with WorkLink. Google offers its Cloud Identity-Aware Proxy as a VPN alternative and as part of its BeyondCorp program, though that has a very different focus, while Microsoft offers a number of more traditional mobile device management solutions.
