Feb 20, 2019
--

Why Daimler moved its big data platform to the cloud

Like virtually every big enterprise, the German auto giant Daimler decided a few years ago to invest in its own on-premises data centers. And while those aren’t going away anytime soon, the company today announced that it has successfully moved its on-premises big data platform to Microsoft’s Azure cloud. The new platform, which the company calls eXtollo, is Daimler’s first major service to run outside of its own data centers, though it probably won’t be the last.

As Guido Vetter, head of Daimler’s corporate center of excellence for advanced analytics and big data, told me, the company started getting interested in big data about five years ago. “We invested in technology — the classical way, on-premise — and got a couple of people on it. And we were investigating what we could do with data because data is transforming our whole business as well,” he said.

By 2016, the organization had grown to the point where a more formal structure was needed to handle the company’s data at a global scale. At the time, the buzz phrase was “data lakes,” and the company started building its own in order to expand its analytics capabilities.

Electric lineup, Daimler AG

“Sooner or later, we hit the limits as it’s not our core business to run these big environments,” Vetter said. “Flexibility and scalability are what you need for AI and advanced analytics and our whole operations are not set up for that. Our backend operations are set up for keeping a plant running and keeping everything safe and secure.” But in this new world of enterprise IT, companies need to be able to be flexible and experiment — and, if necessary, throw out failed experiments quickly.

So about a year and a half ago, Vetter’s team started the eXtollo project to bring all the company’s activities around advanced analytics, big data and artificial intelligence into the Azure Cloud, and just over two weeks ago, the team shut down its last on-premises servers after slowly turning on its solutions in Microsoft’s data centers in Europe, the U.S. and Asia. All in all, the actual transition between the on-premises data centers and the Azure cloud took about nine months. That may not seem fast, but for an enterprise project like this, that’s about as fast as it gets (and for a while, it fed all new data into both its on-premises data lake and Azure).

If you work for a startup, then all of this probably doesn’t seem like a big deal, but for a more traditional enterprise like Daimler, even just giving up control over the physical hardware where your data resides was a major culture change and something that took quite a bit of convincing. In the end, the solution came down to encryption.

“We needed the means to secure the data in the Microsoft data center with our own means that ensure that only we have access to the raw data and work with the data,” explained Vetter. In the end, the company decided to use the Azure Key Vault to manage and rotate its encryption keys. Indeed, Vetter noted that knowing that the company had full control over its own data was what allowed this project to move forward.
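
The article doesn’t go into the specifics of Daimler’s key-management setup, but for a rough idea of what customer-managed keys look like in practice, here is a minimal sketch using the Azure Key Vault Python SDK. The vault URL and key name are placeholders, not anything from eXtollo.

```python
# Minimal sketch: creating and rotating a customer-managed encryption key in
# Azure Key Vault. The vault URL and key name are placeholders, not details
# from Daimler's actual setup.
from azure.identity import DefaultAzureCredential
from azure.keyvault.keys import KeyClient

credential = DefaultAzureCredential()
client = KeyClient(
    vault_url="https://example-vault.vault.azure.net", credential=credential
)

# Create an RSA key the data platform can use for envelope encryption.
key = client.create_rsa_key("datalake-cmk", size=2048)
print(key.name, key.key_type)

# Rotate the key to a new version (requires the rotate permission on the
# vault); older versions remain available for decrypting existing data.
rotated = client.rotate_key("datalake-cmk")
print(rotated.properties.version)
```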

Vetter tells me the company obviously looked at Microsoft’s competitors as well, but he noted that his team didn’t find a compelling offer from other vendors in terms of functionality and the security features that it needed.

Today, Daimler’s big data unit uses tools like HDInsight and Azure Databricks, which cover more than 90 percent of the company’s current use cases. In the future, Vetter also wants to make it easier for less experienced users to use self-service tools to launch AI and analytics services.

While cost is often a factor that counts against the cloud, because renting server capacity isn’t cheap, Vetter argues that this move will actually save the company money and that storage costs, especially, are going to be cheaper in the cloud than in its on-premises data center (and chances are that Daimler, given its size and prestige as a customer, isn’t exactly paying the same rack rate that others are paying for the Azure services).

As with so many big data AI projects, predictions are the focus of much of what Daimler is doing. That may mean looking at a car’s data and error code and helping the technician diagnose an issue or doing predictive maintenance on a commercial vehicle. Interestingly, the company isn’t currently bringing to the cloud any of its own IoT data from its plants. That’s all managed in the company’s on-premises data centers because it wants to avoid the risk of having to shut down a plant because its tools lost the connection to a data center, for example.

Feb 07, 2019
--

Microsoft Azure sets its sights on more analytics workloads

Enterprises now amass huge amounts of data, both from their own tools and applications, as well as from the SaaS applications they use. For a long time, that data was basically exhaust. Maybe it was stored for a while to fulfill some legal requirements, but then it was discarded. Now, data is what drives machine learning models, and the more data you have, the better. It’s maybe no surprise, then, that the big cloud vendors started investing in data warehouses and lakes early on. But that’s just a first step. After that, you also need the analytics tools to make all of this data useful.

Today, it’s Microsoft’s turn to shine the spotlight on its data analytics services. The actual news here is pretty straightforward. Two services are moving into general availability: the second generation of Azure Data Lake Storage for big data analytics workloads, and Azure Data Explorer, a managed service that makes ad-hoc analysis of massive data volumes easier. Microsoft is also previewing a new feature in Azure Data Factory, its graphical no-code service for building data transformations: the ability to map data flows.
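
For readers who haven’t used Data Explorer, ad-hoc analysis here means firing Kusto (KQL) queries at a cluster. A minimal sketch with the azure-kusto-data Python client might look like the following; the cluster, database and table names are placeholders.

```python
# Sketch of an ad-hoc query against an Azure Data Explorer cluster using the
# azure-kusto-data client. Cluster, database and table names are placeholders.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

cluster = "https://example-cluster.westeurope.kusto.windows.net"
kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(cluster)
client = KustoClient(kcsb)

# KQL query: count events per hour over the last day.
query = """
Events
| where Timestamp > ago(1d)
| summarize count() by bin(Timestamp, 1h)
"""
response = client.execute("example-db", query)
for row in response.primary_results[0]:
    print(row["Timestamp"], row["count_"])
```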

Those individual news pieces are interesting if you are a user or are considering Azure for your big data workloads, but what’s maybe more important here is that Microsoft is trying to offer a comprehensive set of tools for managing and storing this data — and then using it for building analytics and AI services.

(Photo credit: Josh Edelson/AFP/Getty Images)

“AI is a top priority for every company around the globe,” Julia White, Microsoft’s corporate VP for Azure, told me. “And as we are working with our customers on AI, it becomes clear that their analytics often aren’t good enough for building an AI platform.” These companies are generating plenty of data, which then has to be pulled into analytics systems. She stressed that she couldn’t remember a customer conversation in recent months that didn’t focus on AI. “There is urgency to get to the AI dream,” White said, but the growth and variety of data presents a major challenge for many enterprises. “They thought this was a technology that was separate from their core systems. Now it’s expected for both customer-facing and line-of-business applications.”

Data Lake Storage helps with managing this variety of data since it can handle both structured and unstructured data (and is optimized for the Spark and Hadoop analytics engines). The service can ingest any kind of data — yet Microsoft still promises that it will be very fast. “The world of analytics tended to be defined by having to decide upfront and then building rigid structures around it to get the performance you wanted,” explained White. Data Lake Storage, on the other hand, wants to offer the best of both worlds.
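
To make that a bit more concrete, here is a minimal sketch of reading both structured and semi-structured files out of a Data Lake Storage Gen2 account from Spark. The account, container and paths are placeholders, and authentication is assumed to be configured in the Spark session already.

```python
# Sketch: reading files from an Azure Data Lake Storage Gen2 account with
# Spark via the ABFS connector. Account, container and paths are placeholders,
# and authentication (e.g. a service principal) is assumed to be configured
# on the Spark session already.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("adls-gen2-example").getOrCreate()

base = "abfss://analytics@exampleaccount.dfs.core.windows.net"

# Structured data: columnar files under a curated, partitioned directory.
trips = spark.read.parquet(f"{base}/curated/trips/")

# Semi-structured data: raw JSON landing in the same account.
events = spark.read.json(f"{base}/raw/telemetry/2019/02/")

trips.join(events, "vehicle_id").groupBy("vehicle_id").count().show()
```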

Likewise, White argued that while many enterprises used to run these services on their own on-premises servers, many of those deployments are still appliance-based. But she believes the cloud has now reached the point where the price/performance calculations are in its favor. It took a while to get to this point, though, and to convince enterprises. White noted that for the longest time, enterprises looked at analytics as $300 million projects that took forever, tied up lots of people and were frankly a bit scary. “But also, what we had to offer in the cloud hasn’t been amazing until some of the recent work,” she said. “We’ve been on a journey — as well as the other cloud vendors — and the price performance is now compelling.” And it sure helps that if enterprises want to meet their AI goals, they’ll now have to tackle these workloads, too.

Sep 12, 2018
--

Sisense hauls in $80M investment as data analytics business matures

Sisense, a company that helps customers understand and visualize their data across multiple sources, announced an $80 million Series E investment today led by Insight Venture Partners. They also announced that Zack Urlocker, former COO at Duo Security and Zendesk, has joined the organization’s board of directors.

The company has attracted a prestigious list of past investors, who also participated in the round, including Battery Ventures, Bessemer Venture Partners, DFJ Venture Capital, Genesis Partners and Opus Capital. Today’s investment brings the total raised to close to $200 million.

CEO Amir Orad says investors like their mission of simplifying complex data with analytics and business intelligence and delivering it in whatever way makes sense. That could be on screens throughout the company, desktop or smartphone, or via Amazon Alexa. “We found a way to make accessing data extremely simple, mashing it together in a logical way and embedding it in every logical place,” he explained.

It appears to be resonating. The company has over 1,000 customers, including Expedia, Oppenheimer and Phillips, to name but a few. Orad says Sisense is actually the analytics engine behind Nasdaq Corporate Solutions, which is the main investor relations system used by CFOs.

He was not in the mood to discuss the company’s valuation, an exercise he called “an ego boost he doesn’t relate to.” He says that he would prefer to be measured by how efficiently he uses the money investors give him or by customer satisfaction scores. Nor would he deal with IPO speculation. All he would say on that front was, “When you focus on the value you bring, positive things happen.”

In spite of that, he was clearly excited about having Urlocker join the board. He says the two spent six months getting to know each other, and he sees in Urlocker someone who has brought several companies to successful exits, and perhaps someone who can help him bring his own company across the finish line, however that ultimately happens. Just last month, Cisco bought Urlocker’s former company, Duo Security, for $2.35 billion.

For now Sisense, which launched in 2010, has another $80 million in the bank. They plan to add to the nearly 500 employees already in place in offices in New York, Tel Aviv, Kiev, Tokyo and Arizona. In particular, they plan to grow their international presence more aggressively, especially adding employees to help with customer success and field engineering. Orad also said that he was open to acquiring companies should the right opportunity come along, saying “Because of talent, technology and presence, it’s something you have to be on lookout for.”

When a company reaches a Series E with a couple of hundred million dollars raised, it’s often at a point where an exit could come sooner rather than later. Adding an experienced executive like Urlocker only underscores that possibility, but for now the company appears to be growing and thriving, and taking the view that whatever will be, will be.

Sep 11, 2018
--

Twilio’s contact center products just got more analytical with Ytica acquisition

Twilio, a company best known for supplying communications APIs for developers, has a product called Twilio Flex for building sophisticated customer service applications on top of those APIs. Today, it announced it was acquiring Ytica (pronounced Why-tica) to provide an operational and analytical layer on top of the customer service solution.

The companies would not discuss the purchase price, but Twilio indicated it does not expect the acquisition to have a material impact on its “results, operations or financial condition.” In other words, it probably didn’t cost much.

Ytica, which is based in Prague, has actually been a partner with Twilio for some time, so coming together in this fashion really made a lot of sense, especially as Twilio has been developing Flex.

Twilio Flex is an app platform for contact centers, which offers a full stack of applications and allows users to deliver customer support over multiple channels, Al Cook, general manager of Twilio Flex explained. “Flex deploys like SaaS, but because it’s built on top of APIs, you can reach in and change how Flex works,” he said. That is very appealing, especially for larger operations looking for a flexible, cloud-based solution without the baggage of on-prem legacy products.
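
The announcement doesn’t spell out what “reaching in” looks like, but Flex’s routing is built on Twilio’s existing TaskRouter APIs, so customization typically means driving those APIs directly. A minimal sketch with the twilio Python helper library might look like this; all SIDs and attributes are placeholders.

```python
# Minimal sketch, not Twilio's official Flex guidance: Flex routing sits on
# top of TaskRouter, so customizing behavior usually means calling the
# TaskRouter API directly. All SIDs and attributes below are placeholders.
import json

from twilio.rest import Client

client = Client("ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX", "your_auth_token")

workspace_sid = "WSXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
workflow_sid = "WWXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"

# Create a task programmatically (say, from a custom channel or a CRM event)
# and let the workflow route it to an available agent.
task = client.taskrouter.workspaces(workspace_sid).tasks.create(
    workflow_sid=workflow_sid,
    attributes=json.dumps({"type": "support", "channel": "chat"}),
)
print(task.sid, task.assignment_status)
```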

What the product was lacking, however, was a native way to manage customer service representatives from within the application, and to understand, through analytics and dashboards, how well or poorly the team was doing. That ability to measure the effectiveness of the team becomes even more critical the larger the group becomes, and Cook indicated some Flex users are managing enormous groups of 10,000 to 20,000 employees.

Ytica provides a way to measure the performance of customer service staff, allowing management to monitor and intervene and coach when necessary. “It made so much sense to join together as one team. They have huge experience in the contact center, and a similar philosophy to build something customizable and programmable in the cloud,” Cook said.

While Ytica works with other vendors beyond Twilio, CEO Simon Vostrý says that they will continue to support those customers, even as they join the Twilio family. “We can run Flex and can continue to run this separately. We have customers running on other SaaS platforms, and we will continue to support them,” he said.

The company will remain in Prague and become a Twilio satellite office. All 14 employees are expected to join the Twilio team and Cook says plans are already in the works to expand the Prague team.

Jul 24, 2018
--

Outlier raises $6.2M Series A to change how companies use data

Traditionally, companies have gathered data from a variety of sources, then used spreadsheets and dashboards to try to make sense of it all. Outlier wants to change that and deliver, right to your inbox, the handful of insights that matter most for your job, company and industry. Today the company announced a $6.2 million Series A to further develop that vision.

The round was led by Ridge Ventures with assistance from 11.2 Capital, First Round Capital, Homebrew, Susa Ventures and SV Angel. The company has raised over $8 million.

The startup is trying to solve a difficult problem: delivering meaningful insight without requiring the customer to ask the right questions. With traditional BI tools, you get your data, start asking questions and see if the data can give you some answers. Outlier wants to add a level of intelligence and automation by pointing out insights without the user having to explicitly ask the right question.
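
Outlier doesn’t disclose how its models work, so the following is purely illustrative, but it captures the general idea of surfacing insights nobody asked for: scan every metric for recent values that deviate sharply from their own history and report only the largest shifts. All names and numbers are made up.

```python
# Purely illustrative; not Outlier's actual methods. A toy version of
# "surfacing insights you didn't ask for": check every metric for a latest
# value that deviates sharply from its recent history, then report the few
# largest deviations instead of waiting for a question.
import numpy as np
import pandas as pd


def surprising_shifts(metrics: pd.DataFrame, window: int = 28, top_n: int = 5):
    """metrics: one column per business metric, one row per day."""
    findings = []
    for name in metrics.columns:
        series = metrics[name]
        mean = series.rolling(window).mean().shift(1)
        std = series.rolling(window).std().shift(1)
        z = (series - mean) / std
        latest = z.iloc[-1]
        if np.isfinite(latest):
            findings.append((name, float(latest)))
    # Largest absolute deviations first; these become the items to surface.
    findings.sort(key=lambda kv: abs(kv[1]), reverse=True)
    return findings[:top_n]


# Example: a metric that suddenly spikes on the last day gets surfaced.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "orders": rng.normal(1000, 30, 60),
    "returns": rng.normal(50, 5, 60),
})
df.loc[df.index[-1], "returns"] = 120  # an unusual spike
print(surprising_shifts(df))
```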

Company founder and CEO Sean Byrnes says his previous company, Flurry, helped deliver mobile analytics to customers, but in his travels meeting customers in that previous iteration, he always came up against the same question: “This is great, but what should I look for in all that data?”

The question was compelling enough that after he sold Flurry to Yahoo in 2014 for more than $200 million, it stuck in the back of his mind and he decided to start a business to solve it. He contends that the first 15 years of BI were about getting answers to basic questions about company performance, but the next 15 will be about finding a way to get the software to ask good questions based on the huge amounts of data now available.

Byrnes admits that when he launched, he didn’t have much sense of how to put this notion into action, and most people he approached didn’t think it was a great idea. He says he heard “No” from a fair number of investors early on because the artificial intelligence required to fuel a solution like this really wasn’t ready in 2015 when he started the company.

He says that it took four or five iterations to get to today’s product, which lets you connect to various data sources and, using artificial intelligence and machine learning, delivers a list of four or five relevant questions to the user’s email inbox, pointing out data you might not have noticed, what he calls “shifts below the surface.” If you’re a retailer, that could be changing market conditions that signal you might want to change your production goals.

Outlier email example. Photo: Outlier

The company launched in 2015. It took some time to polish the product, but today they have 14 employees and 14 customers, including Jack Rogers, Celebrity Cruises and Swarovski.

This round should allow them to continue working to grow the company. “We feel like we hit the right product-market fit because we have customers [generating] reproducible results and really changing the way people use the data,” he said.

Jul 18, 2018
--

Swim.ai raises $10M to bring real-time analytics to the edge

Once upon a time, it looked like cloud-based services would become the central hub for analyzing all IoT data. But it didn’t quite turn out that way, because most IoT solutions simply generate too much data to do this effectively and the round trip to the data center doesn’t work for applications that have to react in real time. Hence the advent of edge computing, which is spawning its own ecosystem of startups.

Among those is Swim.ai, which today announced that it has raised a $10 million Series B funding round led by Cambridge Innovation Capital, with participation from Silver Creek Ventures and Harris Barton Asset Management. The round also included a strategic investment from Arm, the chip design firm you may still remember as ARM (but don’t write it like that or their PR department will promptly email you). This brings the company’s total funding to about $18 million.

Swim.ai has an interesting take on edge computing. The company’s SWIM EDX product combines local data processing and analytics with local machine learning. In a traditional approach, the edge devices collect the data, maybe perform some basic operations against it to bring down the bandwidth cost, and then ship it to the cloud, where the hard work is done and where, if you are doing machine learning, the models are trained. Swim.ai argues that this doesn’t work for applications that need to respond in real time. Swim.ai, instead, performs the model training on the edge device itself by pulling in data from all connected devices. It then builds a digital twin for each of these devices and uses that to self-train its models based on this data.
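
Swim.ai doesn’t detail EDX’s internals beyond that description, but the general pattern is easy to sketch: one lightweight “digital twin” object per device that incrementally learns that device’s normal behavior on the edge node itself. The following is an illustration of the pattern, not Swim.ai’s implementation.

```python
# Illustrative only; not Swim.ai's implementation. The pattern the article
# describes: keep one "digital twin" per connected device, and let each twin
# update a small model incrementally from that device's readings on the edge
# node itself, instead of shipping raw data to the cloud.


class DeviceTwin:
    """Tracks one device and learns its normal behavior online."""

    def __init__(self, device_id, alpha=0.05):
        self.device_id = device_id
        self.alpha = alpha  # learning rate for the running estimates
        self.mean = None
        self.var = 1.0

    def observe(self, value):
        """Update the twin with a new reading; return True if it looks anomalous."""
        if self.mean is None:
            self.mean = value
            return False
        deviation = value - self.mean
        anomalous = deviation * deviation > 9 * self.var  # roughly a 3-sigma test
        # Exponentially weighted updates: cheap enough for constrained hardware.
        self.mean += self.alpha * deviation
        self.var = (1 - self.alpha) * self.var + self.alpha * deviation * deviation
        return anomalous


twins = {}


def on_reading(device_id, value):
    twin = twins.setdefault(device_id, DeviceTwin(device_id))
    if twin.observe(value):
        print(f"{device_id}: reading {value} deviates from learned behavior")


# Simulated stream from one device connected to the edge node.
for v in [20.1, 20.3, 19.9, 20.2, 35.0]:
    on_reading("pump-7", v)
```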

“Demand for the EDX software is rapidly increasing, driven by our software’s unique ability to analyze and reduce data, share new insights instantly peer-to-peer – locally at the ‘edge’ on existing equipment. Efficiently processing edge data and enabling insights to be easily created and delivered with the lowest latency are critical needs for any organization,” said Rusty Cumpston, co-founder and CEO of Swim.ai. “We are thrilled to partner with our new and existing investors who share our vision and look forward to shaping the future of real-time analytics at the edge.”

The company doesn’t disclose any current customers, but it is focusing its efforts on manufacturers, service providers and smart city solutions. Update: Swim.ai did tell us about two customers after we published this story: The City of Palo Alto and Itron.

Swim.ai plans to use its new funding to launch a new R&D center in Cambridge, UK, expand its product development team and tackle new verticals and geographies with an expanded sales and marketing team.

Apr 23, 2018
--

Tableau gets new pricing plans and a data preparation tool

Data analytics platform Tableau today announced the launch of both a new data preparation product and a new subscription pricing plan.

Currently, Tableau offers desktop plans for users who want to analyze their data locally, a server plan for businesses that want to deploy the service on-premises or on a cloud platform, and a fully hosted online plan. Prices for these range from $35 to $70 per user per month. The new pricing plans don’t focus so much on where the data is analyzed but on the analyst’s role. The new Creator, Explorer and Viewer plans are tailored toward different user experiences. They all include access to the new Tableau Prep data preparation tool, Tableau Desktop and new web authoring capabilities, and they are available both on-premises and in the cloud.

Existing users can switch their server or desktop subscriptions to the new release today and then assign each user a creator, explorer or viewer role. As the name indicates, the new viewer role is meant for users who mostly consume dashboards and visualizations but don’t create their own. The explorer role is for those who need access to a pre-defined data set, and the creator role is for analysts and power users who need access to all of Tableau’s capabilities.

“Organizations are facing the urgent need to empower their entire workforce to help drive more revenue, reduce costs, provide better service, increase productivity, discover the next scientific breakthrough and even save lives,” said Adam Selipsky, CEO at Tableau, in today’s announcement. “Our new offerings will help entire organizations make analytics ubiquitous, enabling them to tailor the capabilities required for every employee.”

As for the new data preparation tool, the general idea is to give users a visual way to shape and clean their data, something that’s especially important as businesses now often pull in data from a variety of sources. Tableau Prep can automate some of this, but the most important aspect of the service is that it gives users a visual interface for creating these kinds of workflows. Prep includes support for all the standard Tableau data connectors and lets users perform calculations, too.
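
Tableau Prep itself is a visual tool, but for a sense of the kind of shaping and cleaning such a flow encodes, here is an illustrative sketch of the same steps written out in pandas; the file and column names are invented for the example.

```python
# Illustrative only: Tableau Prep is a visual tool, but this is the kind of
# shape-and-clean workflow the article describes, written out in pandas.
# File names and column names are made up for the example.
import pandas as pd

# Pull in data from two different sources.
orders = pd.read_csv("orders.csv", parse_dates=["order_date"])
customers = pd.read_excel("customers.xlsx")

# Clean: normalize inconsistent region labels and drop rows missing a key.
customers["region"] = customers["region"].str.strip().str.title()
orders = orders.dropna(subset=["customer_id"])

# Shape: join the sources and add a calculated field.
prepped = orders.merge(customers, on="customer_id", how="left")
prepped["revenue"] = prepped["quantity"] * prepped["unit_price"]

# Output a tidy extract ready for analysis in a dashboard.
prepped.to_csv("orders_prepped.csv", index=False)
```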

“Our customers often tell us that they love working with Tableau, but struggle when data is in the wrong shape for analysis,” said Francois Ajenstat, Chief Product Officer at Tableau. “We believe data prep and data analysis are two sides of the same coin that should be deeply integrated and look forward to bringing fun, easy data prep to everyone regardless of technical skill set.”

Apr 05, 2018
--

Google Cloud gives developers more insights into their networks

Google Cloud is launching a new feature today that will give its users a new way to monitor and optimize how their data flows between their servers in the Google Cloud and other Google Services, on-premises deployments and virtually any other internet endpoint. As the name implies, VPC Flow Logs are meant for businesses that already use Google’s Virtual Private Cloud features to isolate their resources from other users.

VPC Flow Logs monitors and logs all the network flows (both UDP and TCP) that are sent from and received by the virtual machines inside a VPC, including traffic between Google Cloud regions. All of that data can be exported to Stackdriver Logging or BigQuery, if you want to keep it in the Google Cloud, or you can use Cloud Pub/Sub to export it to other real-time analytics or security platforms. The data updates every five seconds and Google promises that using this service has no impact on the performance of your deployed applications.

As the company notes in today’s announcement, this will allow network operators to get far more insight into the details of how the Google network performs and to troubleshoot issues if they arise. In addition, it will allow them to optimize their network usage and costs by giving them more information about their global traffic.

All of this data is also quite useful for performing forensics when it looks like somebody may have gotten into your network. If that’s your main use case, though, you probably want to export your data to a specialized security information and event management (SIEM) platform from vendors like Splunk or ArcSight.
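
The announcement doesn’t spell out what that export pipeline looks like in code, but a minimal sketch of consuming flow-log entries from a Cloud Pub/Sub subscription (say, before forwarding them to a SIEM) could look like this. The project and subscription names are placeholders, and a logging sink publishing the flow logs to the topic is assumed to already exist.

```python
# Sketch: consuming VPC Flow Logs that a logging sink publishes to a Cloud
# Pub/Sub topic, e.g. before forwarding them to a SIEM. The project and
# subscription names are placeholders, and the sink is assumed to exist.
import json

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription = subscriber.subscription_path("example-project", "vpc-flow-logs-sub")


def handle(message):
    entry = json.loads(message.data)        # a LogEntry in JSON form
    payload = entry.get("jsonPayload", {})  # flow record fields live here
    print(payload.get("connection"), payload.get("bytes_sent"))
    message.ack()


future = subscriber.subscribe(subscription, callback=handle)
try:
    future.result(timeout=60)  # stream for a minute, then stop
except Exception:
    future.cancel()
```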

Mar 07, 2018
--

S&P Global snares Kensho for $550 million

S&P Global announced today that it will acquire Kensho, a Cambridge, Massachusetts startup that has concentrated on artificial intelligence and analytics for big financial institutions. The total value of the deal is $550 million in a mix of cash and stock.

Kensho, which counted S&P Global as a client/partner and an investor, launched in 2013 and has raised $67.5 million, according to Crunchbase. The most recent funding round was in fact led by S&P Global for $50 million in February 2017. They apparently liked Kensho so much, they bought the company.

“In just a short amount of time, Kensho’s intuitive platforms, sophisticated algorithms and machine learning capabilities have established a wide following throughout Wall Street and the technology world,” S&P Global president and CEO Douglas Peterson said in a statement announcing the deal.

The company doesn’t have small goals. Its stated mission involves solving some of the biggest analytical problems of our time — no pressure or anything. In addition to financial services, the company also has a division called Koto, which concentrates on national security.

As you would expect in a deal like this, Kensho sees S&P Global providing it with financial resources it couldn’t secure on its own through conventional funding channels. Solving those big artificial intelligence problems requires world-class engineers, and that requires an owner or investor with deep pockets. They got that with today’s announcement.

The good news for Kensho and its customers is that S&P Global intends to mostly leave it alone and let it do what it’s been doing. It will continue to operate as an independent brand out of its Cambridge offices.

As usual, the deal will be subject to regulatory approval before it closes. It’s not every day that a company becomes your client, your investor and then your owner, but that’s what happened to Kensho today as it scored the investment hat trick.

Mar 05, 2018
--

Salesforce introduces automated query building feature in Einstein Analytics

Salesforce introduced a new feature to Einstein Analytics today called ‘Conversational Queries’. It’s not conversational in the Alexa sense of conversation, at least not yet, but it does recognize the most common phrasing as you’re typing, providing an automated way to build queries and access data. “Now, with Conversational Queries, users can type phrases related…
