Jul
24
2018
--

Outlier raises $6.2M Series A to change how companies use data

Traditionally, companies have gathered data from a variety of sources, then used spreadsheets and dashboards to try to make sense of it all. Outlier wants to change that by delivering a handful of the insights that matter most for your job, company and industry right to your inbox. Today the company announced a $6.2 million Series A to further develop that vision.

The round was led by Ridge Ventures with assistance from 11.2 Capital, First Round Capital, Homebrew, Susa Ventures and SV Angel. The company has raised over $8 million.

The startup is trying to solve a difficult problem: delivering meaningful insight without requiring the customer to ask the right questions. With traditional BI tools, you get your data, start asking questions and see whether the data can give you answers. Outlier wants to bring a level of intelligence and automation to that process by surfacing insights without the user having to explicitly ask the right question.

Company founder and CEO Sean Byrnes says his previous company, Flurry, delivered mobile analytics to customers, but in his travels meeting those customers he always came up against the same question: “This is great, but what should I look for in all that data?”

The question was so compelling that after he sold Flurry to Yahoo in 2014 for more than $200 million, it stuck in the back of his mind, and he decided to start a business to solve it. He contends that the first 15 years of BI were about getting answers to basic questions about company performance, but the next 15 will be about getting the software to ask good questions based on the huge amounts of data available.

Byrnes admits that when he launched, he didn’t have much sense of how to put this notion into action, and most people he approached didn’t think it was a great idea. He says he heard “No” from a fair number of investors early on because the artificial intelligence required to fuel a solution like this really wasn’t ready in 2015 when he started the company.

He says that it took four or five iterations to get to today’s product, which lets you connect to various data sources and, using artificial intelligence and machine learning, delivers a list of four or five relevant questions to the user’s email inbox. These point out data you might not have noticed, what he calls “shifts below the surface.” If you’re a retailer, that could be changing market conditions that signal you might want to adjust your production goals.

Outlier email example. Photo: Outlier

The company launched in 2015. It took some time to polish the product, but today they have 14 employees and 14 customers including Jack Rogers, Celebrity Cruises and Swarovski.

This round should allow them to continue working to grow the company. “We feel like we hit the right product-market fit because we have customers [generating] reproducible results and really changing the way people use the data,” he said.

Jul
18
2018
--

Swim.ai raises $10M to bring real-time analytics to the edge

Once upon a time, it looked like cloud-based services would become the central hub for analyzing all IoT data. But it didn’t quite turn out that way, because most IoT solutions simply generate too much data to do this effectively, and the round trip to the data center doesn’t work for applications that have to react in real time. Hence the advent of edge computing, which is spawning its own ecosystem of startups.

Among those is Swim.ai, which today announced that it has raised a $10 million Series B funding round led by Cambridge Innovation Capital, with participation from Silver Creek Ventures and Harris Barton Asset Management. The round also included a strategic investment from Arm, the chip design firm you may still remember as ARM (but don’t write it like that or their PR department will promptly email you). This brings the company’s total funding to about $18 million.

Swim.ai has an interesting take on edge computing. The company’s SWIM EDX product combines local data processing and analytics with local machine learning. In the traditional approach, edge devices collect the data, maybe perform some basic operations on it to bring down bandwidth costs, and then ship it to the cloud, where the hard work is done and where, if you are doing machine learning, the models are trained. Swim.ai argues that this doesn’t work for applications that need to respond in real time. Instead, it performs the model training on the edge device itself by pulling in data from all connected devices. It then builds a digital twin for each of these devices and uses that to self-train its models based on this data.
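Swim.ai doesn’t publish its training internals, but the general pattern it describes, a per-device “digital twin” that learns a baseline from that device’s own data stream and reacts locally without a round trip to the cloud, can be sketched roughly as follows. Everything here (the class, thresholds and sensor readings) is a hypothetical illustration, not Swim.ai’s actual code:

```python
# Hypothetical sketch: each "digital twin" keeps running statistics of its
# device's readings (Welford's online algorithm) and flags large deviations
# locally, so the reaction happens at the edge rather than in a data center.
import math

class DeviceTwin:
    def __init__(self, device_id):
        self.device_id = device_id
        self.n = 0          # number of readings seen
        self.mean = 0.0     # running mean
        self.m2 = 0.0       # running sum of squared deviations from the mean

    def observe(self, value, z_threshold=3.0):
        """Flag the reading if it deviates strongly from the learned
        baseline, then fold it into the running statistics."""
        anomalous = False
        if self.n >= 10:    # only judge once we have some history
            std = math.sqrt(self.m2 / (self.n - 1))
            anomalous = std > 0 and abs(value - self.mean) > z_threshold * std
        # Welford update: incorporate the new reading into the baseline.
        self.n += 1
        delta = value - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (value - self.mean)
        return anomalous

twin = DeviceTwin("sensor-42")
readings = [20.1, 20.3, 19.9, 20.0, 20.2, 19.8, 20.1, 20.0, 20.2, 19.9, 55.0]
flags = [twin.observe(r) for r in readings]
print(flags[-1])  # True: the 55.0 spike is flagged against the ~20.0 baseline
```

In a real deployment, model state would live on the edge node alongside the device connection, which is the latency advantage the company is claiming.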

“Demand for the EDX software is rapidly increasing, driven by our software’s unique ability to analyze and reduce data, share new insights instantly peer-to-peer – locally at the ‘edge’ on existing equipment. Efficiently processing edge data and enabling insights to be easily created and delivered with the lowest latency are critical needs for any organization,” said Rusty Cumpston, co-founder and CEO of Swim.ai. “We are thrilled to partner with our new and existing investors who share our vision and look forward to shaping the future of real-time analytics at the edge.”

The company doesn’t disclose any current customers, but it is focusing its efforts on manufacturers, service providers and smart city solutions. Update: Swim.ai did tell us about two customers after we published this story: The City of Palo Alto and Itron.

Swim.ai plans to use its new funding to launch a new R&D center in Cambridge, UK, expand its product development team and tackle new verticals and geographies with an expanded sales and marketing team.

Apr
23
2018
--

Tableau gets new pricing plans and a data preparation tool

Data analytics platform Tableau today announced the launch of both a new data preparation product and a new subscription pricing plan.

Currently, Tableau offers desktop plans for users who want to analyze their data locally, a server plan for businesses that want to deploy the service on-premises or on a cloud platform, and a fully hosted online plan. Prices for these range from $35 to $70 per user per month. The new pricing plans focus not so much on where the data is analyzed as on the analyst’s role. The new Creator, Explorer and Viewer plans are tailored to different user experiences. They all include access to the new Tableau Prep data preparation tool, Tableau Desktop and new web authoring capabilities — and they are available both on-premises and in the cloud.

Existing users can switch their server or desktop subscriptions to the new release today and then assign each user a Creator, Explorer or Viewer role. As the name indicates, the new Viewer role is meant for users who mostly consume dashboards and visualizations but don’t create their own. The Explorer role is for those who need access to a pre-defined data set, and the Creator role is for analysts and power users who need access to all of Tableau’s capabilities.

“Organizations are facing the urgent need to empower their entire workforce to help drive more revenue, reduce costs, provide better service, increase productivity, discover the next scientific breakthrough and even save lives,” said Adam Selipsky, CEO at Tableau, in today’s announcement. “Our new offerings will help entire organizations make analytics ubiquitous, enabling them to tailor the capabilities required for every employee.”

As for the new data preparation tool, the general idea is to give users a visual way to shape and clean their data, something that’s especially important as businesses now often pull in data from a variety of sources. Tableau Prep can automate some of this work, but the most important aspect of the service is that it gives users a visual interface for creating these kinds of workflows. Prep includes support for all the standard Tableau data connectors and lets users perform calculations, too.

“Our customers often tell us that they love working with Tableau, but struggle when data is in the wrong shape for analysis,” said Francois Ajenstat, Chief Product Officer at Tableau. “We believe data prep and data analysis are two sides of the same coin that should be deeply integrated and look forward to bringing fun, easy data prep to everyone regardless of technical skill set.”

Apr
05
2018
--

Google Cloud gives developers more insights into their networks

Google Cloud is launching a feature today that will give its users a new way to monitor and optimize how their data flows between their servers in the Google Cloud and other Google services, on-premises deployments and virtually any other internet endpoint. As the name implies, VPC Flow Logs is meant for businesses that already use Google’s Virtual Private Cloud features to isolate their resources from other users.

VPC Flow Logs monitors and logs all the network flows (both UDP and TCP) that are sent from and received by the virtual machines inside a VPC, including traffic between Google Cloud regions. All of that data can be exported to Stackdriver Logging or BigQuery, if you want to keep it in the Google Cloud, or you can use Cloud Pub/Sub to export it to other real-time analytics or security platforms. The data updates every five seconds and Google promises that using this service has no impact on the performance of your deployed applications.
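To make the export path concrete, here is a minimal sketch of post-processing flow log records once they have been exported. The records below are illustrative stand-ins for entries you might pull from Stackdriver Logging or a Pub/Sub subscription; the `connection` and `bytes_sent` field names follow Google’s documented flow log record layout, but treat the specifics here as an assumption to verify against the current docs:

```python
# Minimal sketch: total bytes sent per (source IP, destination IP) pair,
# computed from a handful of illustrative VPC flow log records.
from collections import defaultdict

sample_records = [
    {"connection": {"src_ip": "10.0.0.2", "dest_ip": "10.0.0.3"}, "bytes_sent": 1200},
    {"connection": {"src_ip": "10.0.0.2", "dest_ip": "10.0.0.3"}, "bytes_sent": 800},
    {"connection": {"src_ip": "10.0.0.5", "dest_ip": "8.8.8.8"}, "bytes_sent": 300},
]

def bytes_by_connection(records):
    """Aggregate bytes_sent over each (src_ip, dest_ip) pair."""
    totals = defaultdict(int)
    for rec in records:
        conn = rec["connection"]
        totals[(conn["src_ip"], conn["dest_ip"])] += rec["bytes_sent"]
    return dict(totals)

print(bytes_by_connection(sample_records))
# {('10.0.0.2', '10.0.0.3'): 2000, ('10.0.0.5', '8.8.8.8'): 300}
```

At real volumes you would run this kind of aggregation as a BigQuery query over the exported table rather than in application code, but the record shape is the same.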

As the company notes in today’s announcement, this will allow network operators to get far more insight into the details of how the Google network performs and to troubleshoot issues if they arise. In addition, it will allow them to optimize their network usage and costs by giving them more information about their global traffic.

All of this data is also quite useful for performing forensics when it looks like somebody may have gotten into your network. If that’s your main use case, though, you probably want to export your data to a specialized security information and event management (SIEM) platform from vendors like Splunk or ArcSight.

Mar
07
2018
--

S&P Global snares Kensho for $550 million

S&P Global announced today that it will acquire Kensho, a Cambridge, Massachusetts startup that has concentrated on artificial intelligence and analytics for big financial institutions. The total value of the deal is $550 million in a mix of cash and stock.

Kensho, which counted S&P Global as a client/partner and an investor, launched in 2013 and has raised $67.5 million, according to Crunchbase. The most recent funding round was in fact led by S&P Global for $50 million in February 2017. They apparently liked Kensho so much, they bought the company.

“In just a short amount of time, Kensho’s intuitive platforms, sophisticated algorithms and machine learning capabilities have established a wide following throughout Wall Street and the technology world,” S&P Global president and CEO Douglas Peterson said in a statement announcing the deal.

The company doesn’t have small goals. Its stated mission involves solving some of the biggest analytical problems of our time — no pressure or anything. In addition to financial services, the company also has a division called Koto, which concentrates on national security.

As you would expect in a deal like this, Kensho sees S&P Global providing it with financial resources it couldn’t have secured alone through conventional funding channels. Solving those big artificial intelligence problems requires world-class engineers, and that requires an owner or investor with deep pockets. It got that with today’s announcement.

The good news for Kensho and its customers is that S&P Global intends to mostly leave it alone and let it do what it’s been doing. It will continue to operate as an independent brand out of its Cambridge offices.

As usual, the deal will be subject to regulatory approval before it closes. It’s not every day that a company is your client, your investor and your owner, but that’s what happened to Kensho today as it scored the investment hat trick.

Mar
05
2018
--

Salesforce introduces automated query building feature in Einstein Analytics

 Salesforce introduced a new feature to Einstein Analytics today called ‘Conversational Queries’. It’s not conversational in the Alexa sense of conversation, at least not yet, but it does recognize the most common phrasing as you’re typing, providing an automated way to build queries and access data. “Now, with Conversational Queries, users can type phrases related… Read More

Feb
05
2018
--

Mixpanel analytics accidentally slurped up passwords

 The passwords of some people using sites monitored by popular analytics provider Mixpanel were mistakenly pulled into its software. Until TechCrunch’s inquiry, Mixpanel had made no public announcement about the embarrassing error beyond quietly emailing clients about the problem. Yet some need to update to a fixed Mixpanel SDK to prevent an ongoing privacy breach. Read More

Jan
23
2018
--

Sumo Logic expands security toolset with FactorChain acquisition

 When we heard from Sumo Logic last June, the company was announcing a $75 million Series F. Today, they announced they were acquiring FactorChain, a security startup that has raised $3.6 million. The companies would not disclose the purchase price, but indicated the acquisition closed at the end of Q4 and all 12 FactorChain employees have joined Sumo Logic, including CEO Dave Frampton and CTO… Read More

Jan
17
2018
--

Google and Salesforce unveil first elements of partnership

 Last fall at Dreamforce, Google and Salesforce announced a partnership. Today, the two companies unveiled the first pieces of that agreement. For starters, Google Analytics 360 users can now import data from the Salesforce CRM tool such as leads and opportunities, among other pieces. This could allow marketers to have a more complete view of the customer journey from first contact to sale… Read More

Sep
12
2017
--

Upcoming Webinar September 14, 2017: Supercharge Your Analytics with ClickHouse

Join Percona’s CTO Vadim Tkachenko (@VadimTk) and Altinity’s co-founder Alexander Zaitsev as they present Supercharge Your Analytics with ClickHouse on Thursday, September 14, 2017, at 10:00 am PDT / 1:00 pm EDT (UTC-7).


ClickHouse is a real-time analytical database system. Even though it is only celebrating its first year as open source software, it has already proved itself ready for serious workloads.

We will talk about ClickHouse in general, some of its internals and why it is so fast. ClickHouse works in conjunction with MySQL – traditionally weak for analytical workloads – and this presentation demonstrates how to make the two systems work together.
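One common way to pair the two systems, for example, is ClickHouse’s mysql() table function, which lets you query a live MySQL table directly from ClickHouse. The host, database, table and credentials below are placeholders, and the schema is invented for illustration:

```sql
-- Hypothetical sketch: run an analytical aggregation in ClickHouse
-- against a table that still lives in MySQL.
SELECT
    toDate(created_at) AS day,
    count() AS events
FROM mysql('mysql-host:3306', 'appdb', 'events', 'reader', 'secret')
GROUP BY day
ORDER BY day;
```

For heavier workloads you would typically replicate the MySQL data into native ClickHouse tables instead of querying over the wire, which is presumably the kind of trade-off the talk covers.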

There will also be an in-person presentation on How to Build Analytics for 100bn Logs a Month with ClickHouse at the meetup Wednesday, September 13, 2017. RSVP here.

Alexander Zaitsev will also be speaking at Percona Live Europe 2017 on Building Multi-Petabyte Data Warehouses with ClickHouse on Wednesday, September 27 at 11:30 am. Use the promo code “SeeMeSpeakPLE17” for 15% off.

Alexander Zaitsev
Altinity’s Co-Founder
Alexander is a co-founder of Altinity. He has 20 years of engineering and engineering management experience at several international companies. Alexander is an expert in the design and implementation of high-scale analytics systems. He has designed and deployed petabyte-scale data warehouses, including one of the earliest ClickHouse deployments outside of Yandex.

Vadim Tkachenko
CTO
Vadim Tkachenko co-founded Percona in 2006 and serves as its Chief Technology Officer. Vadim leads Percona Labs, which focuses on technology research and performance evaluations of Percona’s and third-party products. Percona Labs designs no-gimmick tests of hardware, filesystems, storage engines, and databases that surpass the standard performance and functionality scenario benchmarks. Vadim’s expertise in LAMP performance and multi-threaded programming helps optimize MySQL and InnoDB internals to take full advantage of modern hardware. Oracle Corporation and its predecessors have incorporated Vadim’s source code patches into the mainstream MySQL and InnoDB products.

He also co-authored the book High Performance MySQL: Optimization, Backups, and Replication 3rd Edition. Previously, he founded a web development company in his native Ukraine and spent two years in the High Performance Group within the official MySQL support team. Vadim received a BS in Economics and an MS in computer science from the National Technical University of Ukraine. He now lives in California with his wife and two children.
