May 21, 2019
--

Microsoft makes a push for service mesh interoperability

Service meshes. They are the hot new thing in the cloud-native computing world. At KubeCon, the twice-yearly festival of all things cloud native, Microsoft today announced that it is teaming up with a number of companies in this space to create a generic service mesh interface. This will make it easier for developers to adopt the concept without locking them into a specific technology.

In a world where the number of network endpoints continues to increase as developers launch new microservices, containers and other systems at a rapid clip, service meshes make the network smarter again by handling encryption, traffic management and other functions so that the applications themselves don’t have to worry about them. With a number of competing service mesh technologies, though, including the likes of Istio and Linkerd, developers currently have to choose which one of these to support.

“I’m really thrilled to see that we were able to pull together a pretty broad consortium of folks from across the industry to help us drive some interoperability in the service mesh space,” Gabe Monroy, Microsoft’s lead product manager for containers and the former CTO of Deis, told me. “This is obviously hot technology — and for good reasons. The cloud-native ecosystem is driving the need for smarter networks and smarter pipes and service mesh technology provides answers.”

The partners here include Buoyant, HashiCorp, Solo.io, Red Hat, AspenMesh, Weaveworks, Docker, Rancher, Pivotal, Kinvolk and VMware. That’s a pretty broad coalition, though it notably doesn’t include cloud heavyweights like Google, the company behind Istio, and AWS.

“In a rapidly evolving ecosystem, having a set of common standards is critical to preserving the best possible end-user experience,” said Idit Levine, founder and CEO of Solo.io. “This was the vision behind SuperGloo — to create an abstraction layer for consistency across different meshes, which led us to the release of Service Mesh Hub last week. We are excited to see service mesh adoption evolve into an industry-level initiative with the SMI specification.”

For the time being, the interoperability features focus on traffic policy, telemetry and traffic management. Monroy argues that these are the most pressing problems right now. He also stressed that this common interface still allows the different service mesh tools to innovate, and that developers can always work directly with their APIs when needed. The Service Mesh Interface (SMI), as the new specification is called, does not provide any implementations of these features itself; it only defines a common set of APIs.
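To give a feel for what such a common API looks like in practice, here is a sketch of an SMI-style traffic-split resource, one of the traffic-management APIs the spec defines. The exact API group, version and weight notation shown here are assumptions; check the published SMI specification before relying on them.

```yaml
# Hypothetical SMI-style traffic split: gradually shift traffic from v1 to v2
# of a service, regardless of which mesh (Istio, Linkerd, etc.) implements it.
apiVersion: split.smi-spec.io/v1alpha1
kind: TrafficSplit
metadata:
  name: my-service-rollout
spec:
  service: my-service          # the root service that clients address
  backends:
  - service: my-service-v1
    weight: 900m               # ~90% of traffic
  - service: my-service-v2
    weight: 100m               # ~10% of traffic
```

The point of the abstraction is that a deployment tool can write this one resource and let whichever mesh is installed carry out the split.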

Currently, the most well-known service mesh is probably Istio, which Google, IBM and Lyft launched about two years ago. SMI may just bring a bit more competition to this market since it will allow developers to bet on the overall idea of a service mesh instead of a specific implementation.

In addition to SMI, Microsoft also today announced a couple of other updates around its cloud-native and Kubernetes services. It announced the first alpha of the Helm 3 package manager, for example, as well as the 1.0 release of its Kubernetes extension for Visual Studio Code and the general availability of its AKS virtual nodes, using the open source Virtual Kubelet project.

May 13, 2019
--

Announcing TechCrunch Sessions: Enterprise this September in San Francisco

Of the many categories in the tech world, none is more ferociously competitive than enterprise. For decades, SAP, Oracle, Adobe, Microsoft, IBM and Salesforce, to name a few of the giants, have battled to deliver the tools businesses want to become more productive and competitive. That market is closing in on $500 billion in sales per year, which explains why hundreds of new enterprise startups launch every year and dozens are acquired by the big incumbents trying to maintain their edge.

Last year alone, the top 10 enterprise acquisitions were worth $87 billion and included IBM’s acquiring Red Hat for $34 billion, SAP paying $8 billion for Qualtrics, Microsoft landing GitHub for $7.5 billion, Salesforce acquiring MuleSoft for $6.5 billion and Adobe grabbing Marketo for $4.75 billion. No startup category has made more VCs and founders wildly wealthy, and none has seen more mighty companies rise faster or fall harder. That technology and business thrill ride makes enterprise a category TechCrunch has long wanted to tackle head on.

TC Sessions: Enterprise (September 5 at San Francisco’s Yerba Buena Center) will take on the big challenges and promise facing enterprise companies today. TechCrunch’s editors, notably Frederic Lardinois, Ron Miller and Connie Loizos, will bring to the stage founders and leaders from established and emerging companies to address rising questions like the promised revolution from machine learning and AI, intelligent marketing automation and the inevitability of the cloud, as well as the outer reaches of technology, like quantum and blockchain.

We’ll enlist proven enterprise-focused VCs to reveal where they are directing their early, middle and late-stage investments. And we’ll ask the most proven serial entrepreneurs to tell us what it really took to build that company, and which company they would like to create next. All throughout the show, TechCrunch’s editors will zero in on emerging enterprise technologies to sort the hype from the reality. Whether you are a founder, an investor, enterprise-minded engineer or a corporate CTO / CIO, TC Sessions: Enterprise will provide a valuable day of new insights and great networking.

Tickets are now available for purchase on our website at the early-bird rate of $395. Want to bring a group of people from your company? Get an automatic 15% savings when you purchase four or more tickets at once. Are you an early-stage startup? We have a limited number of Startup Demo Packages available for $2,000, which includes four tickets to attend the event. Students are invited to apply for a reduced-price student ticket at just $245. Additionally, for each ticket purchased for TC Sessions: Enterprise, you will also be registered for a complimentary Expo Only pass to TechCrunch Disrupt SF on October 2-4.

Interested in sponsoring TC Sessions: Enterprise? Fill out this form and a member of our sales team will contact you.

Apr 5, 2019
--

On balance, the cloud has been a huge boon to startups

Today’s startups have a distinct advantage when it comes to launching a company because of the public cloud. You don’t have to build infrastructure or worry about what happens when you scale too quickly. The cloud vendors take care of all that for you.

But last month, when Pinterest announced its IPO, the company’s cloud spend raised eyebrows. You see, the company is spending $750 million a year on cloud services, paid specifically to AWS. When your business is primarily focused on photos and video, and needs to scale on a regular basis, that bill is going to be high.

That price tag prompted Erica Joy, a Microsoft engineer, to publish this tweet and start a little internal debate here at TechCrunch. Startups, after all, have a dog in this fight, and it’s worth exploring whether the cloud is helping feed the startup ecosystem or sending bills soaring, as it has for Pinterest.

For starters, it’s worth pointing out that Ms. Joy works for Microsoft, which just happens to be a primary competitor of Amazon’s in the cloud business. Regardless of her personal feelings on the matter, I’m sure Microsoft would be more than happy to take over that $750 million bill from Amazon. It’s a nice chunk of business, but all that aside, do startups benefit from having access to cloud vendors?

Apr 2, 2019
--

Cloud Foundry ❤ Kubernetes

Cloud Foundry, the open-source platform-as-a-service project that more than half of the Fortune 500 companies use to help them build, test and deploy their applications, launched well before Kubernetes existed. Because of this, the team ended up building Diego, its own container management service. Unsurprisingly, given the popularity of Kubernetes, which has become something of a de facto standard for container orchestration, a number of companies in the Cloud Foundry ecosystem started looking into how they could use Kubernetes to replace Diego.

The result of this is Project Eirini, which was first proposed by IBM. As the Cloud Foundry Foundation announced today, Project Eirini now passes the core functional tests the team runs to validate the software releases of its application runtime, the core Cloud Foundry service that deploys and manages applications (if that’s a bit confusing, don’t even think about the fact that there’s also a Cloud Foundry Container Runtime, which already uses Kubernetes, but which is mostly meant to give enterprises a single platform for running their own applications and pre-built containers from third-party vendors).

“That’s a pretty big milestone,” Cloud Foundry Foundation CTO Chip Childers told me. “The project team now gets to shift to a mode where they’re focused on hardening the solution and making it a bit more production-ready. But at this point, early adopters are also starting to deploy that [new] architecture.”

Childers stressed that while the project was incubated by IBM, which has been a long-time backer of the overall Cloud Foundry project, Google, Pivotal and others are now also contributing and have dedicated full-time engineers working on the project. In addition, SUSE, SAP and IBM are also active in developing Eirini.

Eirini started as an incubation project, and while few doubted that this would be a successful project, there was a bit of confusion around how Cloud Foundry would move forward now that it essentially had two container engines for running its core service. At the time, there was even some concern that the project could fork. “I pushed back at the time and said: no, this is the natural exploration process that open-source communities need to go through,” Childers said. “What we’re seeing now is that with Pivotal and Google stepping in, that’s a very clear sign that this is going to be the go-forward architecture for the future of the Cloud Foundry Application Runtime.”

A few months ago, by the way, Kubernetes was still missing a few crucial pieces the Cloud Foundry ecosystem needed to make this move. Childers specifically noted that Windows support — something the project’s enterprise users really need — was still problematic and lacked some important features. In recent releases, though, the Kubernetes team fixed most of these issues and improved its Windows support, rendering those issues moot.

What does all of this mean for Diego? Childers noted that the community isn’t yet at a point where it’ll halt development of that tool. At some point, though, it seems likely that the community will decide that it’s time to start the transition period and make the move to Kubernetes official.

It’s worth noting that IBM today announced its own preview of Eirini in its Cloud Foundry Enterprise Environment and that the latest version of SUSE’s Cloud Foundry-based Application Platform includes a similar preview as well.

In addition, the Cloud Foundry Foundation, which is hosting its semi-annual developer conference in Philadelphia this week, also announced that it has certified its first two systems integrators, Accenture and HCL, as part of its recently launched certification program for companies that work in the Cloud Foundry ecosystem and have at least 10 certified developers on their teams.

Feb 12, 2019
--

Google and IBM still trying desperately to move cloud market-share needle

When it comes to the cloud market, there are few known knowns. For instance, we know that AWS is the market leader with around 32 percent of market share. We know Microsoft is far back in second place with around 14 percent, the only other company in double digits. We also know that IBM and Google are wallowing in third or fourth place, depending on whose numbers you look at, stuck in single digits. The market keeps expanding, but these two major companies never seem to get a much bigger piece of the pie.

Neither company is satisfied with that, of course. Google so much so that it moved on from Diane Greene at the end of last year, bringing in Oracle veteran Thomas Kurian to lead the division out of the doldrums. Meanwhile, IBM made an even bigger splash, plucking Red Hat from the market for $34 billion in October.

This week, the two companies made some more noise, letting the cloud market know that they are not ceding the market to anyone. For IBM, which is holding its big IBM Think conference this week in San Francisco, it involved opening up Watson to competitor clouds. For a company like IBM, this was a huge move, akin to when Microsoft started building apps for iOS. It was an acknowledgement that working across platforms matters, and that if you want to gain market share, you had better start thinking outside the box.

While becoming cross-platform compatible isn’t exactly a radical notion in general, it most certainly is for a company like IBM, which if it had its druthers and a bit more market share, would probably have been content to maintain the status quo. But if the majority of your customers are pursuing a multi-cloud strategy, it might be a good idea for you to jump on the bandwagon — and that’s precisely what IBM has done by opening up access to Watson across clouds in this fashion.

Clearly, buying Red Hat was about a hybrid cloud play, and if IBM is serious about that approach (and for $34 billion, it had better be), it will have to walk the walk, not just talk the talk. As IBM Watson CTO and chief architect Ruchir Puri told my colleague Frederic Lardinois about the move, “It’s in these hybrid environments, they’ve got multiple cloud implementations, they have data in their private cloud as well. They have been struggling because the providers of AI have been trying to lock them into a particular implementation that is not suitable to this hybrid cloud environment.” This plays right into the Red Hat strategy, and I’m betting you’ll see more of this approach in other parts of the product line from IBM this year. (Google also acknowledged this when it announced a hybrid strategy of its own last year.)

Meanwhile, Thomas Kurian had his coming-out party at the Goldman Sachs Technology and Internet Conference in San Francisco earlier today. Bloomberg reports that he announced a plan to increase the number of salespeople and train them to understand specific verticals, ripping a page straight from the playbook of his former employer, Oracle.

He suggested that his company would be more aggressive in pursuing traditional enterprise customers, although I’m sure his predecessor, Diane Greene, wasn’t exactly sitting around counting on inbound marketing interest to grow sales. In fact, rumor had it that she wanted to pursue government contracts much more aggressively than the company was willing to do. Now it’s up to Kurian to grow sales. Of course, given that Google doesn’t report cloud revenue, it’s hard to know what growth would look like, but perhaps if it has more success it will be more forthcoming.

As Bloomberg’s Shira Ovide tweeted today, it’s one thing to turn to the tried and true enterprise playbook, but that doesn’t mean that executing on that approach is going to be simple, or that Google will be successful in the end.

These two companies obviously desperately want to alter their cloud fortunes, which have been fairly dismal to this point. The moves announced today are clearly part of a broader strategy to move the market share needle, but whether they can, or whether the market positions hardened long ago, remains to be seen.

Jan 17, 2019
--

IBM and Vodafone form cloud, 5G and AI business venture and ink $550M service deal

IBM is one of the world’s biggest system integrators, but to get closer to where enterprises are actually doing their work, it’s been inking partnerships with companies that build devices and run the networks enterprises are using for their IT, and today comes the latest development on that front.

IBM is announcing a new venture with mobile carrier Vodafone, in a deal that comes in two parts. First, IBM will supply Vodafone’s B2B unit, Vodafone Business, with managed services in the areas of cloud and hosting. And second, the two will work together on building and delivering solutions in areas like AI, cloud, 5G, IoT and software-defined networking to enterprise customers.

The latter part of the deal appears to be a classic JV that will see both sides bringing something to the table — employees from both companies will be moving into a separate office together very soon that will essentially be “neutral” territory. The former part, meanwhile, will see Vodafone paying IBM some $550 million in an eight-year agreement.

That price tag alone is a strong indicator that this deal is a big one for both companies.

The agreement follows along the lines of what IBM inked with Apple several years ago, where the two would work together to develop enterprise solutions that would have been more challenging to do on their own.

Indeed, while IBM does provide systems integration services, it hasn’t moved as deeply into mobile-specific solutions for businesses, even as its other operational units — doing research and other work in AI, cloud, quantum computing and other areas — are making strong headway on specific projects, some of which involve mobile technology. Now that it’s nearly in full possession of Red Hat — which it is in the process of buying for $34 billion, a deal that’s now received the approval of Red Hat’s shareholders — it will also have open-source cloud computing to add to that.

What the Vodafone deal will do is take more of those cutting-edge developments that IBM has built and worked on in specific projects, and productise them for a wider audience of businesses and other organisations, which might already be Vodafone customers.

“To deliver multi-cloud strategies in the real world, enterprises need to invest at many levels, ranging from cloud connectivity to cloud governance and management. This new venture between Vodafone and IBM addresses the ‘full stack’ of real-world multi-cloud concerns with a powerful combination of capabilities that should enable customers to deliver multi-cloud strategies in all layers of their organizations,” noted Carla Arend, senior program director for European software at IDC.

The Apple / IBM deal is more than instructive in this case; it will help fuel this new venture. From what I understand, several fruits of that labor will be making their way into the IBM / Vodafone deal, too, which makes sense, considering Vodafone’s position as a mobile carrier and the iPhone making some strong headway into the business market.

“IBM has built industry-leading hybrid cloud, AI and security capabilities underpinned by deep industry expertise,” said IBM Chairman, President and CEO Ginni Rometty in a statement. “Together, IBM and Vodafone will use the power of the hybrid cloud to securely integrate critical business applications, driving business innovation – from agriculture to next-generation retail.”

“Vodafone has successfully established its cloud business to help our customers succeed in a digital world,” said Vodafone CEO Nick Read, in the statement. “This strategic venture with IBM allows us to focus on our strengths in fixed and mobile technologies, whilst leveraging IBM’s expertise in multicloud, AI and services. Through this new venture we’ll accelerate our growth and deepen engagement with our customers while driving radical simplification and efficiency in our business.”

I’ve been told that the first joint “customer engagements” are already happening with an unnamed energy company. Thinking about what kinds of services Vodafone may be providing to end users today — they will cover mobile data and voice connectivity, mobile broadband, IoT and 5G services — this first deal will involve tapping all four, with an emphasis on 5G and IoT.

Jan 8, 2019
--

Amid a legal fight in LA, IBM’s Weather Company launches hyperlocal weather forecasts globally

While IBM is getting sued by the city of Los Angeles, accusing it of covertly mining user data in the Weather Channel app in the US, it’s testing the waters for another hyperlocal weather feature that — coincidentally — relies on data that it picks up from sensors on app users’ smartphones, among other devices, combined with AI at IBM’s end to help model the information.

Today at CES, the company announced a new service called the Global High-Resolution Atmospheric Forecasting System — GRAF for short — a weather forecasting system that it says will provide the most accurate forecasts available for anywhere in the world, updated every hour and at a resolution of three kilometers globally, by crunching around 10 terabytes of data every day.
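To put that resolution in perspective, a quick back-of-envelope calculation shows the scale of a global 3 km grid refreshed hourly. The Earth's surface area used here (~510 million km²) is a general figure, not one taken from the announcement.

```python
# Rough scale of a global forecast grid at 3 km resolution, run hourly.
# Assumption (not from the article): Earth's surface is ~510 million km^2.
EARTH_SURFACE_KM2 = 510e6
CELL_AREA_KM2 = 3.0 * 3.0  # each cell covers roughly a 3 km x 3 km patch

cells = EARTH_SURFACE_KM2 / CELL_AREA_KM2
runs_per_day = 24  # the article says the model runs every hour

print(f"~{cells:.1e} surface cells per run")
print(f"~{cells * runs_per_day:.1e} cell-forecasts per day")
```

Tens of millions of cells per run, over a billion cell-forecasts a day, which helps explain why IBM is pointing at supercomputer-class hardware for the job.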

The new hyperlocal weather data will start to become available in 2019.

This is a key piece of news particularly for the developing world. There has been some effort already to create and use hyperlocal weather information in the US market using things like in-built sensors that can pick up information on, for example, barometric pressure — the very feature that is now the subject of a lawsuit — but there have been fewer efforts to bring that kind of service to a wider, global audience.

“If you’re a farmer in Kenya or Kansas, you will get a way better weather prediction,” said Ginni Rometty, the CEO of IBM, announcing the service today at CES.

She added that other potential end users of the data could include airlines to better predict when a plane might encounter turbulence or other patterns that could affect a flight; insurance companies managing recovery operations and claims around natural disasters; and utility companies monitoring for faults or preparing for severe weather strains on their systems.

Rometty said that the Weather Channel app’s 100 million users — and, by the estimate of Mary Glackin, the Weather Channel’s VP of business solutions, 300 million monthly active users when you count the wider network of places where the data gets used, including Weather.com and Weather Underground — will be providing the data “with consent.” Data sourced from businesses will come from customers that are partners and are also likely to become users of the data.

That data in turn will be run through IBM’s Power9 supercomputers, the same ones used in the US Department of Energy’s Summit and Sierra supercomputers, and modelled using supplementary data from the National Center for Atmospheric Research (NCAR).

The news represents a big step change for the Weather Company and for meteorology research, Glackin said in an interview.

“This is going to be the first significant implementation of GPUs at the Weather Company,” she told me. “The weather community has been slow to adopt to technology, but this is providing much improved performance for us, with higher resolutions and a much finer scale and focus of short-term forecasts.”

The new service of providing hyperlocal data also underscores an interesting turn for IBM as it turns its efforts to building the Weather Channel business into a more global operation, and one that helps deliver more business returns for IBM itself.

Glackin said the Weather Channel app was the most-downloaded weather app in India last year, underscoring how it, like other consumer apps, is seeing more growth outside of the US at the moment after already reaching market saturation in its home market.

Saturation, and some controversy. It’s not clear how the lawsuit in LA will play out, but the fact that it’s been filed definitely points to changing opinions and sensibilities when it comes to the use of personal data, and more generally how consumers and authorities are starting to think about how all that data that we are generating every day on our connected devices is getting used.

IBM is neither the only company implicated nor the most vilified when it comes to this issue, but at a time when the company is still trying to capitalise on the potential of commercialising the trove of information and customer connections in its wider business network, this will be something that affects it as well.

Notably, Rometty closed off her keynote today at CES with a few parting words that reference that.

“As we work on these technologies, all that data that we talked about, that ownership, they belong to the user, and with their permission, we use that,” she said, adding, “These technologies also need to be open and explainable.”

Jan 8, 2019
--

IBM unveils its first commercial quantum computer

At CES, IBM today announced its first commercial quantum computer for use outside of the lab. The 20-qubit system combines into a single package the quantum and classical computing parts it takes to use a machine like this for research and business applications. That package, the IBM Q system, is still huge, of course, but it includes everything a company would need to get started with its quantum computing experiments, including all the machinery necessary to cool the quantum computing hardware.

While IBM describes it as the first fully integrated universal quantum computing system designed for scientific and commercial use, it’s worth stressing that a 20-qubit machine is nowhere near powerful enough for most of the commercial applications that people envision for a quantum computer with more qubits — and qubits that are useful for more than 100 microseconds. It’s no surprise then, that IBM stresses that this is a first attempt and that the systems are “designed to one day tackle problems that are currently seen as too complex and exponential in nature for classical systems to handle.” Right now, we’re not quite there yet, but the company also notes that these systems are upgradable (and easy to maintain).

“The IBM Q System One is a major step forward in the commercialization of quantum computing,” said Arvind Krishna, senior vice president of Hybrid Cloud and director of IBM Research. “This new system is critical in expanding quantum computing beyond the walls of the research lab as we work to develop practical quantum applications for business and science.”

More than anything, though, IBM seems to be proud of the design of the Q systems. In a move that harkens back to Cray’s supercomputers with their expensive couches, IBM worked with design studios Map Project Office and Universal Design Studio, as well as Goppion, the company that has built, among other things, the display cases that house the U.K.’s crown jewels and the Mona Lisa. IBM clearly thinks of the Q system as a piece of art and, indeed, the final result is quite stunning. It’s a nine-foot-tall, nine-foot-wide airtight box, with the quantum computing chandelier hanging in the middle and all of the parts neatly hidden away.

If you want to buy yourself a quantum computer, you’ll have to work with IBM, though. It won’t be available with free two-day shipping on Amazon anytime soon.

In related news, IBM also announced the IBM Q Network, a partnership with ExxonMobil and research labs like CERN and Fermilab that aims to build a community that brings together the business and research interests to explore use cases for quantum computing. The organizations that partner with IBM will get access to its quantum software and cloud-based quantum computing systems.


Dec 11, 2018
--

The Cloud Native Computing Foundation adds etcd to its open-source stable

The Cloud Native Computing Foundation (CNCF), the open-source home of projects like Kubernetes and Vitess, today announced that its technical committee has voted to bring a new project on board. That project is etcd, the distributed key-value store that was first developed by CoreOS (now owned by Red Hat, which in turn will soon be owned by IBM). Red Hat has now contributed this project to the CNCF.

Etcd, which is written in Go, is already a major component of many Kubernetes deployments, where it functions as a source of truth for coordinating clusters and managing the state of the system. Other open-source projects that use etcd include Cloud Foundry, and companies that use it in production include Alibaba, ING, Pinterest, Uber, The New York Times and Nordstrom.
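To illustrate the role a store like this plays, here is a toy Python sketch — emphatically not the etcd API, just a conceptual illustration — of the compare-and-swap style primitive that lets many clients coordinate safely on shared state, which is how Kubernetes components use etcd as a source of truth.

```python
# Toy illustration (not the etcd API): a key-value store with a
# compare-and-swap primitive, the kind of building block that lets
# distributed clients coordinate on shared state without clobbering
# each other's writes.
class ToyStore:
    def __init__(self):
        self._data = {}

    def get(self, key):
        return self._data.get(key)

    def put(self, key, value):
        self._data[key] = value

    def compare_and_swap(self, key, expected, new):
        """Write `new` only if the current value is `expected`.

        A failed swap tells the caller that another client changed
        the state first, so it must re-read and retry.
        """
        if self._data.get(key) != expected:
            return False
        self._data[key] = new
        return True


store = ToyStore()
store.put("leader", "node-a")
# node-b takes over leadership, but only if node-a still holds it:
print(store.compare_and_swap("leader", "node-a", "node-b"))  # True
# node-c's attempt fails, because the state it observed is stale:
print(store.compare_and_swap("leader", "node-a", "node-c"))  # False
```

Real etcd provides this idea as linearizable, replicated transactions over the network; the sketch only shows why conditional writes matter for coordination.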

“Kubernetes and many other projects like Cloud Foundry depend on etcd for reliable data storage. We’re excited to have etcd join CNCF as an incubation project and look forward to cultivating its community by improving its technical documentation, governance and more,” said Chris Aniszczyk, COO of CNCF, in today’s announcement. “Etcd is a fantastic addition to our community of projects.”

Today, etcd has well over 450 contributors and nine maintainers from eight different companies. The fact that it ended up at the CNCF is only logical, given that the foundation is also the host of Kubernetes. With this, the CNCF now plays host to 17 projects that fall under its “incubated technologies” umbrella. In addition to etcd, these include OpenTracing, Fluentd, Linkerd, gRPC, CoreDNS, containerd, rkt, CNI, Jaeger, Notary, TUF, Vitess, NATS, Helm, Rook and Harbor. Kubernetes, Prometheus and Envoy have already graduated from this incubation stage.

That’s a lot of projects for one foundation to manage, but the CNCF community is also extraordinarily large. This week alone about 8,000 developers are converging on Seattle for KubeCon/CloudNativeCon, the organization’s biggest event yet, to talk all things containers. It surely helps that the CNCF has managed to bring competitors like AWS, Microsoft, Google, IBM and Oracle under a single roof to collaboratively work on building these new technologies. There is a risk of losing focus here, though, something that happened to the OpenStack project when it went through a similar growth and hype phase. It’ll be interesting to see how the CNCF will manage this as it brings on more projects (with Istio, the increasingly popular service mesh, being a likely candidate for coming over to the CNCF as well).

Dec 7, 2018
--

IBM selling Lotus Notes/Domino business to HCL for $1.8B

IBM announced last night that it is selling the final components from its 1995 acquisition of Lotus to Indian firm HCL for $1.8 billion.

IBM paid $3.5 billion for Lotus back in the day. The big pieces here are Lotus Notes, Domino and Portal. These were a big part of IBM’s enterprise business for a long time, but last year Big Blue began to pull away, selling the development part to HCL, while maintaining control of sales and marketing.

This announcement marks the end of the line for IBM involvement. With the development of the platform out of its control, and in need of cash after spending $34 billion for Red Hat, perhaps IBM simply decided it no longer made sense to keep any part of this in-house.

As for HCL, it sees an opportunity to continue to build the Notes/Domino business, and it’s seizing it with this purchase. “The large-scale deployments of these products provide us with a great opportunity to reach and serve thousands of global enterprises across a wide range of industries and markets,” C Vijayakumar, president and CEO at HCL Technologies, said in a statement announcing the deal.

Alan Lepofsky, an analyst at Constellation Research who keeps close watch on the enterprise collaboration space, says the sale could represent a fresh start for software that IBM hasn’t really been paying close attention to for some time. “HCL is far more interested in Notes/Domino than IBM has been for a decade. They are investing heavily, trying to rejuvenate the brand,” Lepofsky told TechCrunch.

While this software may feel long in the tooth, Notes and Domino are still in use in many corners of the enterprise, and this is especially true in EMEA (Europe, Middle East and Africa) and AP (Asia Pacific), Lepofsky said.

He added that IBM appears to be completely exiting the collaboration space with this sale. “It appears that IBM is done with collaboration, out of the game,” he said.

This move makes sense for IBM, which is moving in a different direction as it develops its cloud business. The Red Hat acquisition in October, in particular, shows that the company wants to embrace private and hybrid cloud deployments, and older software like Lotus Notes and Domino doesn’t really play a role in that world.

The deal, which is subject to regulatory approval processes, is expected to close in the middle of next year.
