Feb 20, 2019
--

Why Daimler moved its big data platform to the cloud

Like virtually every big enterprise, the German auto giant Daimler decided a few years ago to invest in its own on-premises data centers. And while those aren’t going away anytime soon, the company today announced that it has successfully moved its on-premises big data platform to Microsoft’s Azure cloud. This new platform, which the company calls eXtollo, is Daimler’s first major service to run outside of its own data centers, though it’ll probably not be the last.

As Guido Vetter, head of Daimler’s corporate center of excellence for advanced analytics and big data, told me, the company started getting interested in big data about five years ago. “We invested in technology — the classical way, on-premise — and got a couple of people on it. And we were investigating what we could do with data because data is transforming our whole business as well,” he said.

By 2016, the organization had grown to the point where a more formal structure was needed to handle its data at a global scale. At the time, the buzz phrase was “data lakes,” and the company started building its own in order to build out its analytics capabilities.

Electric lineup, Daimler AG

“Sooner or later, we hit the limits as it’s not our core business to run these big environments,” Vetter said. “Flexibility and scalability are what you need for AI and advanced analytics and our whole operations are not set up for that. Our backend operations are set up for keeping a plant running and keeping everything safe and secure.” But in this new world of enterprise IT, companies need to be able to be flexible and experiment — and, if necessary, throw out failed experiments quickly.

So about a year and a half ago, Vetter’s team started the eXtollo project to bring all the company’s activities around advanced analytics, big data and artificial intelligence into the Azure Cloud, and just over two weeks ago, the team shut down its last on-premises servers after slowly turning on its solutions in Microsoft’s data centers in Europe, the U.S. and Asia. All in all, the actual transition between the on-premises data centers and the Azure cloud took about nine months. That may not seem fast, but for an enterprise project like this, that’s about as fast as it gets (and for a while, it fed all new data into both its on-premises data lake and Azure).

If you work for a startup, then all of this probably doesn’t seem like a big deal, but for a more traditional enterprise like Daimler, even just giving up control over the physical hardware where your data resides was a major culture change and something that took quite a bit of convincing. In the end, the solution came down to encryption.

“We needed the means to secure the data in the Microsoft data center with our own means that ensure that only we have access to the raw data and work with the data,” explained Vetter. In the end, the company decided to use Azure Key Vault to manage and rotate its encryption keys. Indeed, Vetter noted that knowing the company had full control over its own data was what allowed this project to move forward.
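
For readers curious what customer-managed keys look like in practice, here is a minimal sketch using the Azure Key Vault Python SDK (azure-identity and azure-keyvault-keys). The vault URL and key name are hypothetical placeholders, and this illustrates the general pattern rather than Daimler’s actual setup.

```python
# A minimal sketch of customer-managed key handling with Azure Key Vault,
# using the azure-identity and azure-keyvault-keys Python SDKs.
# The vault URL and key name below are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.keyvault.keys import KeyClient

VAULT_URL = "https://example-vault.vault.azure.net"  # hypothetical vault

credential = DefaultAzureCredential()
client = KeyClient(vault_url=VAULT_URL, credential=credential)

# Create (or take ownership of) an RSA key that only the tenant controls.
key = client.create_rsa_key("datalake-encryption-key", size=2048)
print(key.name, key.properties.version)

# "Rotating" the key can be as simple as creating a new version under the
# same name; services that reference the key without pinning a version
# pick up the latest version automatically.
rotated = client.create_rsa_key("datalake-encryption-key", size=2048)
print(rotated.properties.version)
```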

Vetter tells me the company obviously looked at Microsoft’s competitors as well, but he noted that his team didn’t find a compelling offer from other vendors in terms of functionality and the security features that it needed.

Today, Daimler’s big data unit uses tools like HDInsight and Azure Databricks, which cover more than 90 percent of the company’s current use cases. In the future, Vetter also wants to make it easier for less experienced users to launch AI and analytics services through self-service tools.

While cost is often a factor that counts against the cloud, because renting server capacity isn’t cheap, Vetter argues that this move will actually save the company money and that storage costs, especially, are going to be cheaper in the cloud than in its on-premises data center (and chances are that Daimler, given its size and prestige as a customer, isn’t exactly paying the same rack rate that others are paying for the Azure services).

As with so many big data and AI projects, predictions are the focus of much of what Daimler is doing. That may mean looking at a car’s data and error code and helping the technician diagnose an issue or doing predictive maintenance on a commercial vehicle. Interestingly, the company isn’t currently bringing to the cloud any of its own IoT data from its plants. That’s all managed in the company’s on-premises data centers because it wants to avoid the risk of having to shut down a plant because its tools lost the connection to a data center, for example.

Jan 24, 2019
--

Humio raises $9M Series A for its real-time log analysis platform

Humio, a startup that provides a real-time log analysis platform for on-premises and cloud infrastructures, today announced that it has raised a $9 million Series A round led by Accel. It previously raised its seed round from WestHill and Trifork.

The company, which has offices in San Francisco, the U.K. and Denmark, tells me that it saw a 13x increase in its annual revenue in 2018. Current customers include Bloomberg, Microsoft and Netlify.

“We are experiencing a fundamental shift in how companies build, manage and run their systems,” said Humio CEO Geeta Schmidt. “This shift is driven by the urgency to adopt cloud-based and microservice-driven application architectures for faster development cycles, and dealing with sophisticated security threats. These customer requirements demand a next-generation logging solution that can provide live system observability and efficiently store the massive amounts of log data they are generating.”

To deliver that solution, Humio raised this round with an eye toward fulfilling the demand for its service, expanding its research and development teams and moving into more markets across the globe.

As Schmidt also noted, many organizations are rather frustrated by the log management and analytics solutions they currently have in place. “Common frustrations we hear are that legacy tools are too slow — on ingestion, searches and visualizations — with complex and costly licensing models,” she said. “Ops teams want to focus on operations — not building, running and maintaining their log management platform.”

To build this next-generation analysis tool, Humio built its own time-series database engine to ingest the data, with open-source technologies like Scala, Elm and Kafka in the backend. As data enters the pipeline, it’s pushed through live searches and then stored for later queries. As Humio VP of Engineering Christian Hvitved tells me, though, running ad-hoc queries is the exception; most users only do so when they encounter bugs or a DDoS attack.
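
As a rough illustration of that flow — events evaluated against live searches as they arrive, then persisted for the occasional ad-hoc query — here is a small conceptual sketch in Python. It is a toy model of the architecture described above, not Humio’s actual engine.

```python
# Conceptual model of an ingest pipeline: incoming events are pushed through
# registered live searches first and only then written to storage for later
# ad-hoc queries. This is an illustration, not Humio's implementation.
from typing import Callable, Dict, List

Event = Dict[str, object]

class IngestPipeline:
    def __init__(self) -> None:
        self.live_searches: List[Callable[[Event], None]] = []
        self.store: List[Event] = []  # stand-in for the time-series store

    def register_live_search(self, predicate: Callable[[Event], bool],
                             on_match: Callable[[Event], None]) -> None:
        # Wrap the predicate so matches are forwarded to a live view or alert.
        def search(event: Event) -> None:
            if predicate(event):
                on_match(event)
        self.live_searches.append(search)

    def ingest(self, event: Event) -> None:
        # 1. Evaluate every live search against the event as it arrives.
        for search in self.live_searches:
            search(event)
        # 2. Persist the event for the (rarer) ad-hoc queries later.
        self.store.append(event)

pipeline = IngestPipeline()
pipeline.register_live_search(
    predicate=lambda e: int(e.get("status", 0)) >= 500,
    on_match=lambda e: print("alert:", e),
)
pipeline.ingest({"status": 503, "path": "/checkout"})
```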

The query language used for the live filters is also pretty straightforward. That was a conscious decision, Hvitved said. “If it’s too hard, then users don’t ask the question,” he said. “We’re inspired by the Unix philosophy of using pipes, so in Humio, larger searches are built by combining smaller searches with pipes. This is very familiar to developers and operations people since it is how they are used to using their terminal.”
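
To make the pipe idea concrete, here is an illustrative Python sketch of how small search stages can be chained, in the spirit of something like “service=web | status>=500 | count()”. The stage names and the query shape are invented for illustration and do not use Humio’s actual query syntax.

```python
# Unix-pipe-style composition: a larger search is built by feeding the output
# of one small search stage into the next. Stage names are illustrative only.
from typing import Dict, Iterable, Iterator

Event = Dict[str, object]

def field_equals(events: Iterable[Event], field: str, value: object) -> Iterator[Event]:
    """Keep only events where a field matches a value."""
    return (e for e in events if e.get(field) == value)

def at_least(events: Iterable[Event], field: str, threshold: float) -> Iterator[Event]:
    """Keep only events whose numeric field is at or above a threshold."""
    return (e for e in events if float(e.get(field, 0)) >= threshold)

def count(events: Iterable[Event]) -> int:
    """Aggregate, like a trailing count() stage at the end of a pipe."""
    return sum(1 for _ in events)

log = [
    {"service": "web", "status": 500},
    {"service": "web", "status": 200},
    {"service": "db", "status": 500},
]

# Conceptually: service=web | status>=500 | count()
print(count(at_least(field_equals(log, "service", "web"), "status", 500)))  # -> 1
```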

Humio charges its customers based on how much data they want to ingest and for how long they want to store it. Pricing starts at $200 per month for 30 days of data retention and 2 GB of ingested data.

Oct 28, 2018
--

Forget Watson, the Red Hat acquisition may be the thing that saves IBM

With its latest $34 billion acquisition of Red Hat, IBM may have found something more elementary than “Watson” to save its flagging business.

Though the acquisition of Red Hat is by no means a guaranteed victory for the Armonk, N.Y.-based computing company, which has had more downs than ups over the past five years, it seems to be a better bet for “Big Blue” than an artificial intelligence program that was always more hype than reality.

Indeed, commentators are already noting that this may be a case where IBM finally hangs up the Watson hat and returns to the enterprise software and services business that has always been its core competency (albeit one that has been weighted far more heavily on consulting services — to the detriment of the company’s business).

Watson, the business division focused on artificial intelligence whose public claims were always more marketing than market reality, has not performed as well as IBM had hoped, and investors were losing their patience.

Critics — including analysts at the investment bank Jefferies (as early as one year ago) — were skeptical of Watson’s ability to deliver IBM from its business woes.

As we wrote at the time:

Jefferies pulls from an audit of a partnership between IBM Watson and MD Anderson as a case study for IBM’s broader problems scaling Watson. MD Anderson cut its ties with IBM after wasting $60 million on a Watson project that was ultimately deemed, “not ready for human investigational or clinical use.”

The MD Anderson nightmare doesn’t stand on its own. I regularly hear from startup founders in the AI space that their own financial services and biotech clients have had similar experiences working with IBM.

The narrative isn’t the product of any single malfunction, but rather the result of overhyped marketing, deficiencies in operating with deep learning and GPUs and intensive data preparation demands.

That’s not the only trouble IBM has had with Watson’s healthcare results. Earlier this year, the online medical journal Stat reported that Watson was giving clinicians recommendations for cancer treatments that were “unsafe and incorrect” — based on the training data it had received from the company’s own engineers and doctors at Sloan-Kettering who were working with the technology.

All of these woes were reflected in the company’s latest earnings call, where it reported falling revenues, primarily from the Cognitive Solutions business, which includes Watson’s artificial intelligence and supercomputing services. Though IBM’s chief financial officer pointed to “mid-to-high” single-digit growth from Watson’s health business in the quarter, the transaction processing software business fell by 8 percent, and the company’s suite of hosted software services is basically an afterthought for businesses gravitating to Microsoft, Alphabet and Amazon for cloud services.

To be sure, Watson is only one of the segments IBM had been hoping to tap for future growth; and while it was a huge investment area, the company always had its eyes partly fixed on cloud computing as it looked for new areas of growth.

It’s this area of cloud computing where IBM hopes that Red Hat can help it gain ground.

“The acquisition of Red Hat is a game-changer. It changes everything about the cloud market,” said Ginni Rometty, IBM Chairman, President and Chief Executive Officer, in a statement announcing the acquisition. “IBM will become the world’s number-one hybrid cloud provider, offering companies the only open cloud solution that will unlock the full value of the cloud for their businesses.”

The acquisition also puts an incredible amount of marketing power behind Red Hat’s various open source services businesses — giving all of those IBM project managers and consultants new projects to pitch and maybe juicing open source software adoption in the enterprise a bit more aggressively.

As Red Hat chief executive Jim Whitehurst told TheStreet in September, “The big secular driver of Linux is that big data workloads run on Linux. AI workloads run on Linux. DevOps and those platforms, almost exclusively Linux,” he said. “So much of the net new workloads that are being built have an affinity for Linux.”

Oct 10, 2018
--

Zenefits’ Parker Conrad returns with Rippling to kill HR & IT busywork

Parker Conrad likes to save time, even though it’s gotten him in trouble. The former CEO of Zenefits was pushed out of the $4.5 billion human resources startup because he built a hack that let him and employees get faster insurance certifications. But 2.5 years later, he’s back to take the busywork out of staff onboarding as well as clumsy IT services like single sign-on to enterprise apps. Today his startup Rippling launches its combined employee management system, which Conrad calls a much larger endeavor than the minimum viable product it announced while in Y Combinator’s accelerator 18 months ago.

“It’s not an HR system. It’s a level below that,” Conrad tells me. “It’s this unholy, crazy mashup of three different things.” First, it handles payroll, benefits, taxes and PTO across all 50 states. “Except Syria and North Korea, you can pay anyone in the world with Rippling,” Conrad claims. That makes it a competitor with Gusto… and Zenefits.

Second, it’s a replacement for Okta, Duo and other enterprise single-sign on security apps that authenticate staffers across partnered apps. Rippling bookmarklets make it easy to auth into over 250 workplace apps, like Gmail, Slack, Dropbox, Asana, Trello, AWS, Salesforce, GitHub and more. When an employee is hired or changes teams, a single modification to their role in Rippling automatically changes all the permissions of what they can access.
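
As a conceptual sketch of that single-source-of-truth idea — one role change fanning out to every app permission — consider the following Python toy model. The role-to-app mapping and function names are hypothetical and are not Rippling’s implementation.

```python
# Toy model of a single employee system of record driving per-app access:
# changing the role once recomputes grants and revocations for every app.
# The role-to-app mapping below is hypothetical.
from dataclasses import dataclass, field
from typing import Dict, Set

ROLE_APP_ACCESS: Dict[str, Set[str]] = {
    "engineer": {"gmail", "slack", "github", "aws"},
    "sales": {"gmail", "slack", "salesforce"},
}

@dataclass
class Employee:
    name: str
    role: str
    app_access: Set[str] = field(default_factory=set)

def apply_role(employee: Employee) -> None:
    """Recompute every app permission from the employee's current role."""
    desired = ROLE_APP_ACCESS.get(employee.role, set())
    for app in desired - employee.app_access:
        print(f"grant {employee.name} access to {app}")
    for app in employee.app_access - desired:
        print(f"revoke {employee.name}'s access to {app}")
    employee.app_access = set(desired)

alice = Employee("Alice", role="engineer")
apply_role(alice)        # initial provisioning on hire
alice.role = "sales"     # a single change in the system of record...
apply_role(alice)        # ...propagates grants and revocations to every app
```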

And third, it handles computer endpoint security like Jamf. When an employee is hired, Rippling can instantly ship them a computer with all the right software installed and the hard drive encrypted, or have staffers add the Rippling agent that enforces the company’s security standards. The system is designed so there’s no need for an expert IT department to manage it.

“Distributed, fragmented systems of record for employee data are secretly the cause of almost all the annoying administrative work of running a company,” Conrad explains. “If you could build this system that ties all of it together, you could eliminate all this crap work.” That’s Rippling. It’s opening up to all potential clients today, charging them a combined subscription or à la carte fees for any of the three wings of the product.

Conrad refused to say how much Rippling has raised total, citing the enhanced scrutiny Zenefits’ raises drew. But he says a Wall Street Journal report that Rippling had raised $7 million was inaccurate. “We haven’t raised any priced VC rounds. Just a bunch of seed money. We raised from Initialized Capital, almost all the early seed investors at Zenefits and a lot of individuals.” He cited Y Combinator, YC Growth Fund, YC’s founder Jessica Livingston and president Sam Altman, other YC partners, as well as DFJ and SV Angel.

“Because we were able to raise a bunch of money and court great engineers . . . we were able to spend a lot of time building this fundamental technology,” Conrad tells me. Rippling has about 50 team members now, with about 40 of them being engineers, highlighting just how thoroughly Conrad wants to eradicate manual work about work, starting with his own startup.

The CEO refused to discuss details of exactly what went down at Zenefits and whether he thought his ejection was fair. He was accused of allowing Zenefits’ insurance brokers to sell in states where they weren’t licensed, and giving some employees a macro that let them more quickly pass the online insurance certification exam. Conrad ended up paying about $534,000 in SEC fines. Zenefits laid off 430 employees, or 45 percent of its staff, and moved to selling software to small-to-medium sized businesses through a network of insurance brokers.

But when asked what he’d learned from Zenefits, Conrad looked past those troubles and instead recalled that “one of the mistakes that we made was that we did a lot of stuff manually behind the scenes. When you scale up, there are these manual processes, and it’s really hard to come back later when it’s a big hard complicated thing and replace it with technology. You get upside down on margins. If you start at the beginning and never let the manual processes creep in . . . it sort of works.”

Perhaps it was trying to cut corners that got Conrad into the Zenefits mess, but now that same intention has inspired Rippling’s goal of eliminating HR and IT drudgery with an all-in-one tool.

“I think I’m someone who feels the pain of that kind of stuff particularly strongly. So that’s always been a real irritant to me, and I saw this problem. The conventional wisdom is ‘don’t build something like this, start with something much smaller,’ ” Conrad concludes. “But I knew if I didn’t do this, that no one else was going to do it and I really wanted this system to exist. This is a company that’s all about annoying stuff and making that fucking annoying stuff go away.”

Feb 14, 2017
--

Gamalon leverages the work of an 18th century reverend to organize unstructured enterprise data

It’s hard to fathom that the work of Reverend Thomas Bayes is still coming back to drive cutting edge advancements in AI, but that’s exactly what’s happening. DARPA-backed Gamalon is the latest carrier of the Bayesian baton, launching today with a solution to help enterprises better manage their gnarly unstructured data. The world of enterprise is full of… Read More

Dec 22, 2016
--

Xplenty raises another $4 million to help you integrate all your data

The internet has changed a lot over the last two decades, but many companies are still using legacy technologies to extract, transform and load their data into warehouses. One new entrant, Xplenty, is hoping that its fresh approach, prioritizing cloud services, will provide a solid foothold in the massive market for data integration tools. Having grown to serve over 100 customers, Xplenty… Read More

Jul 25, 2015
--

Software For The Full-Stack Era

The history of software is dominated by companies that automated how the biggest companies in the world did business. IBM automated clerical tasks, SAP unified corporate financials, and Siebel digitized Rolodexes for relationship-driven salespeople. With that focus in mind, it’s no wonder that many of the world’s largest technology companies survive based on their ties to IT… Read More
