Jun
10
2019
--

Salesforce’s Tableau acquisition is huge, but not the hugest

When you’re talking about 16 billion smackeroos, it’s easy to get lost in the big number. Salesforce’s $15.7 billion acquisition of Tableau this morning was among the biggest enterprise deals ever, but it certainly wasn’t the largest.

There was widespread speculation that when the new tax laws went into effect in 2017, and large tech companies could repatriate large sums of their money stored offshore, we would start to see a wave of M&A activity, and sure enough that’s happened.

As Box CEO Aaron Levie pointed out on Twitter, it also shows that if you can develop a best-of-breed tool that knocks off the existing dominant tool set, you can build a multibillion-dollar company. We have seen this over and over, maybe not $15 billion companies, but substantial companies with multibillion-dollar price tags.

Last year alone we saw 10 deals worth a combined $87 billion, with the biggest prize going to IBM when it bought Red Hat for a cool $34 billion, but even that wasn’t the biggest enterprise deal we could track down. In fact, we decided to compile a list of the biggest enterprise deals ever, so you could get a sense of where today’s deal fits.

Salesforce buys MuleSoft for $6.5 billion in 2018

At the time, this was the biggest deal Salesforce had ever done — until today. While the company has been highly acquisitive over the years, it had tended to keep its deals fairly compact for the most part, but it wanted MuleSoft to give it access to enterprise data wherever it lived, and it was willing to pay for it.

Microsoft buys GitHub for $7.5 billion in 2018

Not to be outdone by its rival, Microsoft opened its wallet almost exactly a year ago and bought GitHub for a hefty $7.5 billion. There was some hand-wringing in the developer community at the time, but so far, Microsoft has allowed the company to operate as an independent subsidiary.

SAP buys Qualtrics for $8 billion in 2018

SAP swooped in right before Qualtrics was about to IPO and gave it an offer it couldn’t refuse. Qualtrics gave SAP a tool for measuring customer satisfaction, something it had been lacking and was willing to pay big bucks for.

Oracle acquires NetSuite for $9.3 billion in 2016

It wasn’t really a surprise when Oracle acquired NetSuite. It had been an investor and Oracle needed a good SaaS tool at the time, as it was transitioning to the cloud. NetSuite gave it a ready-to-go packaged cloud service with a built-in set of customers it desperately needed.

Salesforce buys Tableau for $15.7 billion in 2019

That brings us to today’s deal. Salesforce swooped in again and paid an enormous sum of money for the Seattle software company, giving it a data visualization tool that would enable customers to create views of data wherever it lives, whether it’s part of Salesforce or not. What’s more, it was a great complement to last year’s MuleSoft acquisition.

Broadcom acquires CA Technologies for $18.9 billion in 2018

A huge deal in dollars from a year of big deals. Broadcom surprised a few people when the chip vendor paid this kind of money for a legacy enterprise software vendor and IT services company. The $18.9 billion represented a 20% premium for shareholders.

Microsoft snags LinkedIn for $26 billion in 2016

This was a company that Salesforce reportedly wanted badly at the time, but Microsoft was able to flex its financial muscles and come away the winner. The big prize was all of that data, and Microsoft has been working to turn that into products ever since.

IBM snares Red Hat for $34 billion in 2018

Near the end of last year, IBM made a huge move, acquiring Red Hat for $34 billion. IBM has been preaching a hybrid cloud approach for a number of years, and buying Red Hat gives it a much more compelling hybrid story.

Dell acquires EMC for $67 billion in 2016

This was the biggest of all, by far surpassing today’s deal. A deal this large was in the news for months as it passed various hurdles on the way to closing. Among the jewels that were included in this deal were VMware and Pivotal, the latter of which has since gone public. After this deal, Dell itself went public again last year.

Note: A reader on Twitter pointed out one we missed: Symantec bought Veritas for $13.5 billion in 2004.

Jun
10
2019
--

Salesforce is officially making Seattle its second HQ after its Tableau acquisition

Here’s an interesting by-product of the news today that Salesforce would be acquiring Tableau for $15.7 billion: the company is going to make Seattle, Wash. (home of Tableau) the official second headquarters of San Francisco-based Salesforce, putting the company directly in the face of tech giants and Salesforce frenemies Microsoft and Amazon.

“An HQ2, if you will,” Salesforce CEO Marc Benioff quipped right after he dropped the news during the press and analyst call.

HQ2, of course, is a reference to Amazon and its year-long, massively publicised, often criticised and ultimately botched search (it eventually cancelled plans to build an HQ in NYC, but kept Arlington) for its own second headquarters, which it also branded “HQ2.”

If real estate sends a message — and if you’ve ever seen Salesforce Tower in San Francisco, you know it does for this company — Salesforce is sending one here. And that message is: Hello, Microsoft and Amazon, we’re coming at you.

As we pointed out earlier today, there is a clear rivalry between Microsoft and Salesforce that first began to simmer in the area of CRM but has over time expanded to a wider array of products and services that cater to the needs of enterprise knowledge workers.

The most well-known of these was the tug-of-war between the two to acquire LinkedIn, a struggle that Microsoft ultimately won. Over the years, as both have continued to diversify their products to bring in a wider swathe of enterprise users, and across a wider range of use cases, that competition has become a little more pointed. (Indeed, here’s some perfect timing: just today, Microsoft expanded its business analytics tools.)

I’d argue that the competitive threat of Amazon is a little more remote. At the moment, in fact, the two work very closely: specifically in September last year, Amazon and Salesforce extended an already years-long deal to integrate AWS and Salesforce products to aid in enterprise “digital transformation” (one of Salesforce’s catch phrases).

Placing Salesforce physically closer to Amazon could even underscore how the two might work closer together in the future — not least because cloud storage is now a notably missing jewel in Salesforce’s enterprise IT crown as it squares up to Microsoft, which has Azure. (And it’s not just a Seattle thing. Google, which has Google Cloud Platform, acquired Tableau competitor Looker last week.)

On the other hand, you have to wonder about the longer-term trajectory for Salesforce and its ambitions. The Tableau deal takes it firmly into a new area of business that up to now has been more of a side-gig: data and analytics. Coming from two different directions — infrastructure for AWS and customer management for Salesforce — enterprise data has been a remote battleground for both companies for years already, and it will be interesting to see how the two sides approach it.

Notably, this is not Salesforce’s first effort to lay down roots in the city. It established an engineering office there in 2017 and, as Benioff pointed out today, putting down deeper roots in what he described as a “unique market with tremendous talent” will let the company tap that talent even more.

Jun
10
2019
--

Microsoft Power Platform update aims to put AI in reach of business users

Low code and no code are the latest industry buzzwords, but if vendors can truly abstract away the complexity of difficult tasks like building machine learning models, it could help mainstream technologies that are currently out of reach of most business users. That’s precisely what Microsoft is aiming to do with its latest Power Platform announcements today.

The company tried to bring that low-code simplicity to building applications last year when it announced PowerApps. Now it believes that by combining PowerApps with Microsoft Flow and its new AI Builder tool, it can allow folks building apps with PowerApps to add a layer of intelligence very quickly.

It starts with having access to data sources, and the Data Connector tool gives users access to more than 250 data connectors. That includes Salesforce, Oracle and Adobe, as well as, of course, Microsoft services like Office 365 and Dynamics 365. Richard Riley, senior director for Power Platform marketing, says this is the foundation for pulling data into AI Builder.

“AI Builder is all about making it just as easy in a low-code, no-code way to go bring artificial intelligence and machine learning into your Power Apps, into Microsoft Flow, into the Common Data Service, into your data connectors, and so on,” Riley told TechCrunch.

Screenshot: Microsoft

Charles Lamanna, general manager at Microsoft, says that Microsoft can do all the analysis and heavy lifting required to build a data model for you, removing a huge barrier to entry for business users. “The basic idea is that you can select any field in the Common Data Service and just say, ‘I want to predict this field.’ Then we’ll actually go look at historical records for that same table or entity to go predict [the results],” he explained. This could be used to predict if a customer will sign up for a credit card, if a customer is likely to churn, or if a loan would be approved, and so forth.
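Lamanna’s “pick a field, predict it from historical records” flow can be sketched in plain code. This is a deliberately simplified illustration of the idea, not AI Builder’s actual API or model; the entity and field names below are hypothetical.

```python
# Illustrative sketch of "select a field, predict it from historical records".
# NOT AI Builder's API; "plan" and "churned" are made-up fields, and the
# "model" here is just a per-feature majority vote rather than real ML.

from collections import Counter, defaultdict

def train_field_predictor(records, target_field, feature_field):
    """Learn, per feature value, the most common value of the target field."""
    by_feature = defaultdict(Counter)
    for rec in records:
        by_feature[rec[feature_field]][rec[target_field]] += 1
    return {feat: counts.most_common(1)[0][0] for feat, counts in by_feature.items()}

# Historical records for the entity, as described in the quote above:
history = [
    {"plan": "basic", "churned": True},
    {"plan": "basic", "churned": True},
    {"plan": "basic", "churned": False},
    {"plan": "pro", "churned": False},
    {"plan": "pro", "churned": False},
]

model = train_field_predictor(history, target_field="churned", feature_field="plan")
print(model["basic"])  # -> True: basic-plan customers mostly churned
print(model["pro"])    # -> False
```

The point of products like AI Builder is that the user only names the field; the platform handles everything the `train_field_predictor` step stands in for here.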

This announcement comes the same day that Salesforce announced it was buying Tableau for almost $16 billion, and days after Google bought Looker for $2.6 billion, and shows how powerful data can be in a business context, especially when providing a way to put that data to use, whether in the form of visualization or inside business applications.

While Microsoft admits AI Builder won’t be something everyone uses, it does see a kind of power user who might previously have been unable to approach this level of sophistication on their own, building apps and adding layers of intelligence without a heck of a lot of coding. If it works as advertised, it will bring a level of simplicity to tasks that were previously well out of reach of business users, without requiring a data scientist. Regardless, all of this activity shows data has become central to business, and vendors are going to build or buy to put it to work.

Jun
10
2019
--

Salesforce is buying data visualization company Tableau for $15.7B in all-stock deal

On the heels of Google buying analytics startup Looker last week for $2.6 billion, Salesforce today announced a huge piece of news in a bid to step up its own work in data visualization and (more generally) tools to help enterprises make sense of the sea of data that they use and amass: Salesforce is buying Tableau for $15.7 billion in an all-stock deal.

The latter is publicly traded and this deal will involve shares of Tableau Class A and Class B common stock getting exchanged for 1.103 shares of Salesforce common stock, the company said, and so the $15.7 billion figure is the enterprise value of the transaction, based on the average price of Salesforce’s shares as of June 7, 2019.
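The mechanics of a fixed-ratio all-stock deal are simple arithmetic: each Tableau share converts into 1.103 Salesforce shares, so the implied per-share value moves with Salesforce’s stock price. The exchange ratio below is from the announcement; the Salesforce share price is a made-up round number for illustration, not the actual June 7, 2019 average.

```python
# Back-of-the-envelope math for a fixed-ratio all-stock deal.
# exchange_ratio comes from the announcement; crm_share_price is a
# hypothetical illustrative figure, not Salesforce's real average price.

exchange_ratio = 1.103      # Salesforce (CRM) shares per Tableau share
crm_share_price = 150.00    # hypothetical average Salesforce share price ($)

implied_value_per_tableau_share = exchange_ratio * crm_share_price
print(f"${implied_value_per_tableau_share:.2f} per Tableau share")  # -> $165.45
```

This is why the $15.7 billion figure is quoted as being "based on the average price of Salesforce’s shares": until closing, the deal’s dollar value floats with CRM’s stock.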

This is a huge jump on Tableau’s last market cap: it was valued at $10.79 billion at close of trading Friday, according to figures on Google Finance. (Also: trading has halted on its stock in light of this news.)

The two boards have already approved the deal, Salesforce notes. The two companies’ management teams will be hosting a conference call at 8am Eastern and I’ll listen in to that as well to get more details.

This is a huge deal for Salesforce as it continues to diversify beyond CRM software and into deeper layers of analytics.

The company reportedly worked hard to buy LinkedIn — but ultimately missed out, and Microsoft picked it up instead — and while there isn’t a whole lot in common between LinkedIn and Tableau, this deal will also help Salesforce extend its engagement (and data intelligence) for the customers it already has — something LinkedIn would also have helped it do.

This also looks like a move designed to help Salesforce bulk up against Google’s move to buy Looker, announced last week, although I’d argue that analytics is a big enough area that all the major tech companies courting enterprises are getting their ducks in a row with stronger strategies (and products) here. It’s unclear whether the two deals were made in response to each other, although it seems that Salesforce has been eyeing up Tableau for years.

“We are bringing together the world’s #1 CRM with the #1 analytics platform. Tableau helps people see and understand data, and Salesforce helps people engage and understand customers. It’s truly the best of both worlds for our customers–bringing together two critical platforms that every customer needs to understand their world,” said Marc Benioff, chairman and co-CEO, Salesforce, in a statement. “I’m thrilled to welcome Adam and his team to Salesforce.”

Tableau has about 86,000 business customers, including Charles Schwab, Verizon (which owns TC), Schneider Electric, Southwest and Netflix. Salesforce said Tableau will operate independently and under its own brand post-acquisition. It will also remain headquartered in Seattle, Wash., headed by CEO Adam Selipsky along with others on the current leadership team.

Indeed, later during the call, Benioff let it drop that Seattle would become Salesforce’s official second headquarters with the closing of this deal.

That’s not to say, though, that the two will not be working together.

On the contrary, Salesforce is already talking up the possibilities of expanding what the company is already doing with its Einstein platform (launched back in 2016, Einstein is the home of all of Salesforce’s AI-based initiatives); and with “Customer 360,” which is the company’s product and take on omnichannel sales and marketing. The latter is an obvious and complementary product home, given that one huge aspect of Tableau’s service is to provide “big picture” insights.

“Joining forces with Salesforce will enhance our ability to help people everywhere see and understand data,” said Selipsky. “As part of the world’s #1 CRM company, Tableau’s intuitive and powerful analytics will enable millions more people to discover actionable insights across their entire organizations. I’m delighted that our companies share very similar cultures and a relentless focus on customer success. I look forward to working together in support of our customers and communities.”

“Salesforce’s incredible success has always been based on anticipating the needs of our customers and providing them the solutions they need to grow their businesses,” said Keith Block, co-CEO, Salesforce. “Data is the foundation of every digital transformation, and the addition of Tableau will accelerate our ability to deliver customer success by enabling a truly unified and powerful view across all of a customer’s data.”

Jun
05
2019
--

Microsoft and Oracle link up their clouds

Microsoft and Oracle announced a new alliance today that will see the two companies connect their clouds over a direct network connection so that their users can move workloads and data seamlessly between the two. The alliance goes a bit beyond basic connectivity and also includes identity interoperability.

This kind of alliance is relatively unusual between what are essentially competing clouds, but while Oracle wants to be seen as a major player in this space, it also realizes that it isn’t likely to reach the size of an AWS, Azure or Google Cloud anytime soon. For Oracle, the alliance means that its users can run services like the Oracle E-Business Suite and Oracle JD Edwards on Azure while still using an Oracle database in the Oracle cloud, for example. With that, Microsoft still gets to run the workloads and Oracle gets to do what it does best (though Azure users will also continue to be able to run their Oracle databases in the Azure cloud, too).

“The Oracle Cloud offers a complete suite of integrated applications for sales, service, marketing, human resources, finance, supply chain and manufacturing, plus highly automated and secure Generation 2 infrastructure featuring the Oracle Autonomous Database,” said Don Johnson, executive vice president, Oracle Cloud Infrastructure (OCI), in today’s announcement. “Oracle and Microsoft have served enterprise customer needs for decades. With this alliance, our joint customers can migrate their entire set of existing applications to the cloud without having to re-architect anything, preserving the large investments they have already made.”

For now, the direct interconnect between the two clouds is limited to Azure US East and Oracle’s Ashburn data center. The two companies plan to expand this alliance to other regions in the future, though they remain mum on the details. It’ll support applications like JD Edwards EnterpriseOne, E-Business Suite, PeopleSoft, Oracle Retail and Hyperion on Azure, in combination with Oracle databases like RAC, Exadata and the Oracle Autonomous Database running in the Oracle Cloud.

“As the cloud of choice for the enterprise, with over 95% of the Fortune 500 using Azure, we have always been first and foremost focused on helping our customers thrive on their digital transformation journeys,” said Scott Guthrie, executive vice president of Microsoft’s Cloud and AI division. “With Oracle’s enterprise expertise, this alliance is a natural choice for us as we help our joint customers accelerate the migration of enterprise applications and databases to the public cloud.”

Today’s announcement also fits within a wider trend at Microsoft, which has recently started building a number of alliances with other large enterprise players, including its open data alliance with SAP and Adobe, as well as a somewhat unorthodox gaming partnership with Sony.

 

Jun
04
2019
--

How Kubernetes came to rule the world

Open source has become the de facto standard for building the software that underpins the complex infrastructure that runs everything from your favorite mobile apps to your company’s barely usable expense tool. Over the course of the last few years, a lot of new software has been deployed on top of Kubernetes, the tool for managing large server clusters running containers that Google open sourced five years ago.

Today, Kubernetes is the fastest-growing open-source project, and earlier this month the bi-annual KubeCon+CloudNativeCon conference attracted almost 8,000 developers to sunny Barcelona, Spain, making the event the largest open-source conference in Europe yet.

To talk about how Kubernetes came to be, I sat down with Craig McLuckie, one of the co-founders of Kubernetes at Google (who then went on to his own startup, Heptio, which he sold to VMware); Tim Hockin, another Googler who was an early member on the project and was also on Google’s Borg team; and Gabe Monroy, who co-founded Deis, one of the first successful Kubernetes startups, and then sold it to Microsoft, where he is now the lead PM for Azure Container Compute (and often the public face of Microsoft’s efforts in this area).

Google’s cloud and the rise of containers

To set the stage a bit, it’s worth remembering where Google Cloud and container management were five years ago.

May
23
2019
--

Serverless and containers: Two great technologies that work better together

Cloud-native models using containerized software in a continuous delivery approach could benefit from serverless computing, where the cloud vendor provisions exactly the resources required to run a workload on the fly. While the major cloud vendors have recognized this and are already creating products to abstract away the infrastructure, it may not work for every situation in spite of the benefits.

Cloud native, put simply, involves using containerized applications and Kubernetes to deliver software in small packages called microservices. This enables developers to build and deliver software faster and more efficiently in a continuous delivery model. In the cloud native world, you should be able to develop code once and run it anywhere, on prem or any public cloud, or at least that is the ideal.

Serverless is actually a bit of a misnomer. There are servers underlying the model, but instead of dedicated virtual machines, the cloud vendor delivers exactly the right number of resources to run a particular workload for the right amount of time and no more.
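The billing consequence of "the right resources for the right amount of time and no more" is the clearest way to see the difference from a dedicated VM: you pay per invocation for the memory-seconds actually consumed. A minimal sketch, using a hypothetical rate (the real figure varies by vendor and tier):

```python
# Sketch of serverless pay-per-use billing: cost tracks consumed
# GB-seconds, not an always-on server. The rate is a hypothetical
# placeholder, not any vendor's actual price.

PRICE_PER_GB_SECOND = 0.0000166667  # hypothetical $/GB-second

def invocation_cost(memory_gb, duration_s):
    """Cost of one function invocation billed on consumed GB-seconds."""
    return memory_gb * duration_s * PRICE_PER_GB_SECOND

# One million invocations of a 128 MB function running 200 ms each:
total = 1_000_000 * invocation_cost(0.128, 0.2)
print(f"${total:.2f}")  # -> $0.43
```

A workload that sits idle between requests costs nothing in this model, which is precisely the property a dedicated virtual machine can’t offer.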

Nothing is perfect

Such an arrangement would seem to be perfectly suited to a continuous delivery model, and while vendors have recognized the beauty of such an approach, as one engineer pointed out, there is never a free lunch in processes that are this complex, and it won’t be a perfect solution for every situation.

Aparna Sinha, director of product management at Google, says the Kubernetes community has really embraced the serverless idea, but she says it is limited in its current implementation, delivered in the form of functions with products like AWS Lambda, Google Cloud Functions and Azure Functions.

“Actually, I think the functions concept is a limited concept. It is unfortunate that that is the only thing that people associate with serverless,” she said.

She says that Google has tried to be more expansive in its definition. “It’s basically a concept for developers where you are able to seamlessly go from writing code to deployment and the infrastructure takes care of all of the rest, making sure your code is deployed in the appropriate way across the appropriate, most resilient parts of the infrastructure, scaling it as your app needs additional resources, scaling it down as your traffic goes down, and charging you only for what you’re consuming,” she explained.

But Matt Whittington, senior engineer on the Kubernetes team at Atlassian, says that while it sounds good in theory, in practice fully automated infrastructure could be unrealistic in some instances. “Serverless could be promising for certain workloads because it really allows developers to focus on the code, but it’s not a perfect solution. There is still some underlying tuning.”

He says you may not be able to leave it completely up to the vendor unless there is a way to specify the requirements for each container, such as a minimum container load time, a certain container kill time, or delivery to a specific location. He says in reality it won’t be fully automated, at least while developers fiddle with the settings to make sure they are getting the resources they need without over-provisioning and paying for more than they need.
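The kind of per-container tuning Whittington describes is exactly what Kubernetes resource requests and limits express today: the developer states a guaranteed floor and a hard ceiling instead of leaving sizing entirely to the platform. A minimal container spec, written here as a Python dict mirroring the usual YAML (the image name and values are illustrative):

```python
# Minimal Kubernetes-style container spec as a Python dict mirroring
# the usual YAML. requests/limits are the knobs developers tune to
# avoid over-provisioning; the image and numbers are illustrative.

container_spec = {
    "name": "api",
    "image": "example.com/api:1.0",  # hypothetical image
    "resources": {
        "requests": {"cpu": "250m", "memory": "256Mi"},  # guaranteed floor
        "limits":   {"cpu": "500m", "memory": "512Mi"},  # hard ceiling
    },
}

print(container_spec["resources"]["requests"]["cpu"])  # -> 250m
```

The gap between the request and the limit is the space the platform can manage automatically; fully serverless operation would mean not having to write this block at all.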

Vendors bringing solutions

The vendors are putting in their two cents, trying to create tools that bring this ideal together. For instance, Google announced a service called Google Cloud Run at Google Cloud Next last month. It’s based on the open-source Knative project, and in essence brings the benefits of serverless to developers running containers. Other similar services include AWS Fargate and Azure Container Instances, both of which attempt to bring these two technologies together in a similar package.

In fact, Gabe Monroy, partner program manager at Microsoft, says Azure Container Instances is designed to solve this problem without being dependent on a functions-driven programming approach. “What Azure Container Instances does is it allows you to run containers directly on the Azure compute fabric, no virtual machines, hypervisor isolated, pay-per-second billing. We call it serverless containers,” he said.

While serverless and containers might seem like a good fit, as Monroy points out, there isn’t a one-size-fits-all approach to cloud-native technologies, whatever the approach may be. Some people will continue to use a function-driven serverless approach like AWS Lambda or Azure Functions and others will shift to containers and look for other ways to bring these technologies together. Whatever happens, as developer needs change, it is clear the open-source community and vendors will respond with tools to help them. Bringing serverless and containers together is just one example of that.

May
21
2019
--

Microsoft makes a push for service mesh interoperability

Service meshes. They are the hot new thing in the cloud-native computing world. At KubeCon, the bi-annual festival of all things cloud native, Microsoft today announced that it is teaming up with a number of companies in this space to create a generic service mesh interface. This will make it easier for developers to adopt the concept without locking them into a specific technology.

In a world where the number of network endpoints continues to increase as developers launch new microservices, containers and other systems at a rapid clip, service meshes make the network smarter again by handling encryption, traffic management and other functions so that the actual applications don’t have to worry about them. With a number of competing service mesh technologies, though, including the likes of Istio and Linkerd, developers currently have to choose which one to support.

“I’m really thrilled to see that we were able to pull together a pretty broad consortium of folks from across the industry to help us drive some interoperability in the service mesh space,” Gabe Monroy, Microsoft’s lead product manager for containers and the former CTO of Deis, told me. “This is obviously hot technology — and for good reasons. The cloud-native ecosystem is driving the need for smarter networks and smarter pipes and service mesh technology provides answers.”

The partners here include Buoyant, HashiCorp, Solo.io, Red Hat, AspenMesh, Weaveworks, Docker, Rancher, Pivotal, Kinvolk and VMware. That’s a pretty broad coalition, though it notably doesn’t include cloud heavyweights like Google, the company behind Istio, and AWS.

“In a rapidly evolving ecosystem, having a set of common standards is critical to preserving the best possible end-user experience,” said Idit Levine, founder and CEO of Solo.io. “This was the vision behind SuperGloo — to create an abstraction layer for consistency across different meshes, which led us to the release of Service Mesh Hub last week. We are excited to see service mesh adoption evolve into an industry-level initiative with the SMI specification.”

For the time being, the interoperability features focus on traffic policy, telemetry and traffic management. Monroy argues that these are the most pressing problems right now. He also stressed that this common interface still allows the different service mesh tools to innovate, and that developers can always work directly with their APIs when needed. He noted that the Service Mesh Interface (SMI), as this new specification is called, does not provide any of its own implementations of these features; it only defines a common set of APIs.
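The design Monroy describes, a spec that defines APIs but ships no implementation, can be sketched as an abstract interface that each mesh fills in. The class and method names below are hypothetical illustrations of the pattern, not the actual SMI resource definitions:

```python
# Sketch of the SMI idea: a common interface (here, for traffic
# splitting) with no implementation of its own; each mesh supplies one.
# Names are hypothetical, not the real SMI specification.

from abc import ABC, abstractmethod

class TrafficSplitAPI(ABC):
    """The common API surface: defined by the spec, implemented by meshes."""

    @abstractmethod
    def split(self, service: str, weights: dict) -> str:
        ...

class MeshA(TrafficSplitAPI):
    def split(self, service, weights):
        return f"mesh-a routing {service}: {weights}"

class MeshB(TrafficSplitAPI):
    def split(self, service, weights):
        return f"mesh-b routing {service}: {weights}"

def canary(api: TrafficSplitAPI):
    # Application code targets the interface, not a specific mesh,
    # so swapping mesh implementations requires no app changes.
    return api.split("checkout", {"v1": 90, "v2": 10})

print(canary(MeshA()))
print(canary(MeshB()))
```

This is the sense in which SMI lets developers "bet on the overall idea of a service mesh instead of a specific implementation": the `canary` caller never names a mesh.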

Currently, the most well-known service mesh is probably Istio, which Google, IBM and Lyft launched about two years ago. SMI may just bring a bit more competition to this market since it will allow developers to bet on the overall idea of a service mesh instead of a specific implementation.

In addition to SMI, Microsoft also today announced a couple of other updates around its cloud-native and Kubernetes services. It announced the first alpha of the Helm 3 package manager, for example, as well as the 1.0 release of its Kubernetes extension for Visual Studio Code and the general availability of its AKS virtual nodes, using the open source Virtual Kubelet project.

May
13
2019
--

Announcing TechCrunch Sessions: Enterprise this September in San Francisco

Of the many categories in the tech world, none is more ferociously competitive than enterprise. For decades, SAP, Oracle, Adobe, Microsoft, IBM and Salesforce, to name a few of the giants, have battled to deliver the tools businesses want to become more productive and competitive. That market is closing in on $500 billion in sales per year, which explains why hundreds of new enterprise startups launch every year and dozens are acquired by the big incumbents trying to maintain their edge.

Last year alone, the top 10 enterprise acquisitions were worth $87 billion and included IBM’s acquiring Red Hat for $34 billion, SAP paying $8 billion for Qualtrics, Microsoft landing GitHub for $7.5 billion, Salesforce acquiring MuleSoft for $6.5 billion and Adobe grabbing Marketo for $4.75 billion. No startup category has made more VCs and founders wildly wealthy, and none has seen more mighty companies rise faster or fall harder. That technology and business thrill ride makes enterprise a category TechCrunch has long wanted to tackle head on.

TC Sessions: Enterprise (September 5 at San Francisco’s Yerba Buena Center) will take on the big challenges and promise facing enterprise companies today. TechCrunch’s editors, notably Frederic Lardinois, Ron Miller and Connie Loizos, will bring to the stage founders and leaders from established and emerging companies to address rising questions like the promised revolution from machine learning and AI, intelligent marketing automation and the inevitability of the cloud, as well as the outer reaches of technology, like quantum and blockchain.

We’ll enlist proven enterprise-focused VCs to reveal where they are directing their early, middle and late-stage investments. And we’ll ask the most proven serial entrepreneurs to tell us what it really took to build that company, and which company they would like to create next. All throughout the show, TechCrunch’s editors will zero in on emerging enterprise technologies to sort the hype from the reality. Whether you are a founder, an investor, enterprise-minded engineer or a corporate CTO / CIO, TC Sessions: Enterprise will provide a valuable day of new insights and great networking.

Tickets are now available for purchase on our website at the early-bird rate of $395. Want to bring a group of people from your company? Get an automatic 15% savings when you purchase four or more tickets at once. Are you an early-stage startup? We have a limited number of Startup Demo Packages available for $2,000, which includes four tickets to attend the event. Students are invited to apply for a reduced-price student ticket at just $245. Additionally, for each ticket purchased for TC Sessions: Enterprise, you will also be registered for a complimentary Expo Only pass to TechCrunch Disrupt SF on October 2-4.
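For a quick sense of the group savings described above, the math on four early-bird tickets (prices taken from the post) works out like this:

```python
# Group-discount math using the prices quoted in the post.

early_bird = 395        # $ per ticket at the early-bird rate
group_discount = 0.15   # automatic 15% off for four or more tickets

group_of_four = 4 * early_bird * (1 - group_discount)
print(f"${group_of_four:.2f}")  # -> $1343.00, vs. $1580.00 at full price
```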

Interested in sponsoring TC Sessions: Enterprise? Fill out this form and a member of our sales team will contact you.

May
07
2019
--

Red Hat and Microsoft are cozying up some more with Azure Red Hat OpenShift

It won’t be long before Red Hat becomes part of IBM, the result of the $34 billion acquisition last year that is still making its way toward completion. For now, Red Hat continues as a stand-alone company, and as if to flex its independence muscles, it announced its second agreement in two days with Microsoft Azure, Redmond’s public cloud infrastructure offering. This one involves running Red Hat OpenShift on Azure.

OpenShift is Red Hat’s Kubernetes offering. The thinking is that you can start with OpenShift in your data center, then, as you begin to shift to the cloud, move to Azure Red Hat OpenShift — such a catchy name — without any fuss, because you have the same management tools you have been used to using.

As Red Hat becomes part of IBM, it sees that it’s more important than ever to maintain its sense of autonomy in the eyes of developers and operations customers, as it holds its final customer conference as an independent company. Red Hat’s executive vice president and president of products and technologies certainly sees it that way. “I think [the partnership] is a testament to, even with moving to IBM at some point soon, that we are going to be separate and really keep our Switzerland status and give the same experience for developers and operators across anyone’s cloud,” he told TechCrunch.

It’s essential to see this announcement in the context of both IBM’s and Microsoft’s increasing focus on the hybrid cloud, and also in the continuing requirement for cloud companies to find ways to work together, even when it doesn’t always seem to make sense, because as Microsoft CEO Satya Nadella has said, customers will demand it. Red Hat has a big enterprise customer presence and so does Microsoft. If you put them together, it could be the beginning of a beautiful friendship.

Scott Guthrie, executive vice president for the cloud and AI group at Microsoft, understands that. “Microsoft and Red Hat share a common goal of empowering enterprises to create a hybrid cloud environment that meets their current and future business needs. Azure Red Hat OpenShift combines the enterprise leadership of Azure with the power of Red Hat OpenShift to simplify container management on Kubernetes and help customers innovate on their cloud journeys,” he said in a statement.

This news comes on the heels of yesterday’s announcement, also involving Kubernetes. TechCrunch’s own Frederic Lardinois described it this way:

What’s most interesting here, however, is KEDA, a new open-source collaboration between Red Hat and Microsoft that helps developers deploy serverless, event-driven containers. Kubernetes-based event-driven autoscaling, or KEDA, as the tool is called, allows users to build their own event-driven applications on top of Kubernetes. KEDA handles the triggers to respond to events that happen in other services and scales workloads as needed.
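The event-driven autoscaling idea in KEDA can be sketched in a few lines: the replica count is derived from an external event source (for example, a queue length) rather than CPU load, and can drop to zero when nothing is happening. This is illustrative logic only, not KEDA’s actual API or scaling algorithm; the per-replica threshold is a made-up tunable.

```python
# Sketch of event-driven autoscaling in the KEDA spirit: replicas track
# an event source (queue depth) instead of CPU. Not KEDA's real API;
# msgs_per_replica is a hypothetical tunable.

def desired_replicas(queue_length, msgs_per_replica=5,
                     min_replicas=0, max_replicas=10):
    """Scale to zero when idle; add a replica per msgs_per_replica messages."""
    if queue_length <= 0:
        return min_replicas
    needed = -(-queue_length // msgs_per_replica)  # ceiling division
    return max(min_replicas, min(max_replicas, needed))

print(desired_replicas(0))    # -> 0  (no events: scaled to zero)
print(desired_replicas(12))   # -> 3  (ceil(12 / 5))
print(desired_replicas(999))  # -> 10 (capped at max_replicas)
```

Scaling down to zero between events is what makes the model "serverless": the container only consumes resources while there is work to respond to.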

Azure Red Hat OpenShift is available now on Azure. The companies are working on some other integrations, too, including Red Hat Enterprise Linux (RHEL) running on Azure and Red Hat Enterprise Linux 8 support in Microsoft SQL Server 2019.
