Feb
20
2019
--

Why Daimler moved its big data platform to the cloud

Like virtually every big enterprise company, a few years ago, the German auto giant Daimler decided to invest in its own on-premises data centers. And while those aren’t going away anytime soon, the company today announced that it has successfully moved its on-premises big data platform to Microsoft’s Azure cloud. This new platform, which the company calls eXtollo, is Daimler’s first major service to run outside of its own data centers, though it’ll probably not be the last.

As Guido Vetter, the head of Daimler’s corporate center of excellence for advanced analytics and big data, told me, the company started getting interested in big data about five years ago. “We invested in technology — the classical way, on-premise — and got a couple of people on it. And we were investigating what we could do with data because data is transforming our whole business as well,” he said.

By 2016, the size of the organization had grown to the point where a more formal structure was needed to enable the company to handle its data at a global scale. At the time, the buzz phrase was “data lakes” and the company started building its own in order to build out its analytics capacities.

Electric lineup, Daimler AG

“Sooner or later, we hit the limits as it’s not our core business to run these big environments,” Vetter said. “Flexibility and scalability are what you need for AI and advanced analytics and our whole operations are not set up for that. Our backend operations are set up for keeping a plant running and keeping everything safe and secure.” But in this new world of enterprise IT, companies need to be able to be flexible and experiment — and, if necessary, throw out failed experiments quickly.

So about a year and a half ago, Vetter’s team started the eXtollo project to bring all the company’s activities around advanced analytics, big data and artificial intelligence into the Azure Cloud, and just over two weeks ago, the team shut down its last on-premises servers after slowly turning on its solutions in Microsoft’s data centers in Europe, the U.S. and Asia. All in all, the actual transition between the on-premises data centers and the Azure cloud took about nine months. That may not seem fast, but for an enterprise project like this, that’s about as fast as it gets (and for a while, it fed all new data into both its on-premises data lake and Azure).

If you work for a startup, then all of this probably doesn’t seem like a big deal, but for a more traditional enterprise like Daimler, even just giving up control over the physical hardware where your data resides was a major culture change and something that took quite a bit of convincing. In the end, the solution came down to encryption.

“We needed the means to secure the data in the Microsoft data center with our own means that ensure that only we have access to the raw data and work with the data,” explained Vetter. In the end, the company decided to use the Azure Key Vault to manage and rotate its encryption keys. Indeed, Vetter noted that knowing that the company had full control over its own data was what allowed this project to move forward.
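
Vetter didn’t detail the exact scheme, but customer-managed keys typically follow an envelope-encryption pattern: each piece of data is encrypted with its own data key, and that data key is in turn wrapped by a master key that only the customer controls (in Daimler’s case, managed in Azure Key Vault). The toy sketch below illustrates only the key hierarchy; the XOR “cipher” and all names are stand-ins, not Azure’s API, and a real system would use a proper cipher such as AES-GCM via the Key Vault SDK.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Toy "cipher": repeat the key over the data and XOR byte-by-byte.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def encrypt_record(plaintext: bytes, master_key: bytes):
    data_key = secrets.token_bytes(32)              # fresh key per record
    ciphertext = xor_bytes(plaintext, data_key)     # encrypt with the data key
    wrapped_key = xor_bytes(data_key, master_key)   # wrap the data key with the master key
    return ciphertext, wrapped_key

def decrypt_record(ciphertext: bytes, wrapped_key: bytes, master_key: bytes) -> bytes:
    data_key = xor_bytes(wrapped_key, master_key)   # unwrap with the master key
    return xor_bytes(ciphertext, data_key)

master = secrets.token_bytes(32)  # held only by the customer
ct, wk = encrypt_record(b"telemetry from vehicle 42", master)
assert decrypt_record(ct, wk, master) == b"telemetry from vehicle 42"
```

One practical payoff of this hierarchy: rotating the master key means re-wrapping only the small data keys, not re-encrypting the data itself.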

Vetter tells me the company obviously looked at Microsoft’s competitors as well, but he noted that his team didn’t find a compelling offer from other vendors in terms of functionality and the security features that it needed.

Today, Daimler’s big data unit uses tools like HDInsight and Azure Databricks, which cover more than 90 percent of the company’s current use cases. In the future, Vetter also wants to make it easier for less experienced users to use self-service tools to launch AI and analytics services.

While cost is often a factor that counts against the cloud, because renting server capacity isn’t cheap, Vetter argues that this move will actually save the company money and that storage costs, especially, are going to be cheaper in the cloud than in its on-premises data center (and chances are that Daimler, given its size and prestige as a customer, isn’t exactly paying the same rack rate that others are paying for the Azure services).

As with so many big data AI projects, predictions are the focus of much of what Daimler is doing. That may mean looking at a car’s data and error code and helping the technician diagnose an issue or doing predictive maintenance on a commercial vehicle. Interestingly, the company isn’t currently bringing to the cloud any of its own IoT data from its plants. That’s all managed in the company’s on-premises data centers because it wants to avoid the risk of having to shut down a plant because its tools lost the connection to a data center, for example.

Feb
19
2019
--

Google acquires cloud migration platform Alooma

Google today announced its intention to acquire Alooma, a company that allows enterprises to combine all of their data sources into services like Google’s BigQuery, Amazon’s Redshift, Snowflake and Azure. The promise of Alooma is that it handles the data pipelines and manages them for its users. In addition to this data integration service, though, Alooma also helps with migrating to the cloud, cleaning up this data and then using it for AI and machine learning use cases.
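
Alooma’s pipeline internals aren’t public, but the extract-transform-load shape that such services manage can be sketched as follows (all names, sources and field conventions here are hypothetical, not Alooma’s API):

```python
def extract(sources):
    # Pull raw rows from each configured source, tagging their origin.
    for name, rows in sources.items():
        for row in rows:
            yield {"source": name, **row}

def transform(rows):
    # Normalize field names across sources and drop rows missing a user id.
    for row in rows:
        uid = row.get("user_id") or row.get("uid")
        if uid is None:
            continue
        yield {"source": row["source"], "user_id": uid}

def load(rows, warehouse):
    # Append cleaned rows to the destination table.
    warehouse.setdefault("events", []).extend(rows)

sources = {
    "crm":  [{"user_id": 1}, {"name": "no id, dropped"}],
    "logs": [{"uid": 2}],
}
warehouse = {}
load(transform(extract(sources)), warehouse)
# warehouse["events"] now holds two normalized rows, one per valid record
```

The hard part a managed service takes over is keeping such pipelines running as schemas drift and sources fail, rather than the plumbing itself.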

“Here at Google Cloud, we’re committed to helping enterprise customers easily and securely migrate their data to our platform,” Google VP of engineering Amit Ganesh and Google Cloud Platform director of product management Dominic Preuss write today. “The addition of Alooma, subject to closing conditions, is a natural fit that allows us to offer customers a streamlined, automated migration experience to Google Cloud, and give them access to our full range of database services, from managed open source database offerings to solutions like Cloud Spanner and Cloud Bigtable.”

Before the acquisition, Alooma had raised about $15 million, including an $11.2 million Series A round led by Lightspeed Venture Partners and Sequoia Capital in early 2016. The two companies did not disclose the price of the acquisition, but chances are we are talking about a modest price, given how much Alooma had previously raised.

Neither Google nor Alooma said much about what will happen to the existing products and customers — and whether it will continue to support migrations to Google’s competitors. We’ve reached out to Google and will update this post once we hear more.

Update: Here is Google’s statement about the future of the platform:

For now, it’s business as usual for Alooma and Google Cloud as we await regulatory approvals and complete the deal. After close, the team will be joining us in our Tel Aviv and Sunnyvale offices, and we will be leveraging the Alooma technology and team to provide our Google Cloud customers with a great data migration service in the future.

Regarding supporting competitors, yes, the existing Alooma product will continue to support other cloud providers. We will only be accepting new customers that are migrating data to Google Cloud Platform, but existing customers will continue to have access to other cloud providers.

So going forward, Alooma will not accept any new customers who want to migrate data to any competitors. That’s not necessarily surprising, and it’s good to see that Google will continue to support Alooma’s existing users. Those who use Alooma in combination with AWS, Azure and other non-Google services will likely start looking for other solutions soon, though, as this also means that Google isn’t likely to develop the service for them beyond its current state.

Alooma’s co-founders do stress, though, that “the journey is not over. Alooma has always aimed to provide the simplest and most efficient path toward standardizing enterprise data from every source and transforming it into actionable intelligence,” they write. “Joining Google Cloud will bring us one step closer to delivering a full self-service database migration experience bolstered by the power of their cloud technology, including analytics, security, AI, and machine learning.”

Feb
19
2019
--

GN acquires Altia Systems for $125M to add video to its advanced audio solutions

Some interesting M&A is afoot in the world of hardware and software that’s aiming to improve the quality of audio and video communications over digital networks.

GN Group — the Danish company that broke new ground in mobile when it inked deals first with Apple and then Google to stream audio from their phones directly to smart, connected hearing aids — is expanding from audio to video, and from Europe to Silicon Valley.

Today, the company announced that it would acquire Altia Systems, a startup out of Cupertino that makes a “surround” videoconferencing device and software called the PanaCast (we reviewed it once), designed to replicate the panoramic, immersive experience of human vision.

GN is paying $125 million for the startup. For some context, this price represents a decent return: according to PitchBook, Altia was last valued at around $78 million with investors including Intel Capital and others.

Intel’s investment was one of several strategic partnerships that Altia had inked over the years. (Another was with Zoom to provide a new video solution for Uber.)

The Intel partnership, for one, will continue post-acquisition. “Intel invested in Altia Systems to bring an industry leading immersive, Panoramic-4K camera experience to business video collaboration,” said Dave Flanagan, Vice President of Intel Corporation and Senior Managing Director of Intel Capital, in a statement. “Over the past few years, Altia Systems has collaborated with Intel to use AI and to deliver more intelligent conference rooms and business meetings. This helps customers make better decisions, automate workflows and improve business efficiency. We are excited to work with GN to further scale this technology on a global basis.”

We have seen a lot of applications of AI in just about every area of technology, but one of the less talked about, though very interesting, areas has been how it’s being used to enhance audio on digital networks. Pindrop, as one example, is creating and tracking “audio fingerprints” for security applications, specifically fraud prevention (to authenticate users and to help weed out imposters based not just on the actual voice but on all the other aural cues that we may not pick up as humans but that can help build a picture of a caller’s location and so on).

GN, meanwhile, has been building AI-based algorithms to help those who cannot hear as well, or who simply need to hear better, listen to calls on digital networks and make out what’s being said. This not only requires technology to optimise the audio quality, but also algorithms that can help tailor that quality to a specific person’s own unique hearing needs.

One of the more obvious applications of services like these is for those who are hard of hearing and use hearing aids (which can be awful or impossible to use with mobile phones); another is in call centers, and that appears to be the area GN is hoping to address with the Altia acquisition.

GN already offers two products for call center workers, Jabra and BlueParrot — headsets and speakerphones with their own proprietary software that the company claims makes workers more efficient and productive simply by making it easier to understand what callers are saying.

Altia will be integrated into that solution to expand it to include videoconferencing around unified communications solutions, creating more natural experiences for those who are not actually in physical rooms together.

“Combining GN Audio’s sound expertise, partner eco-system and global channel access with the video technology from Altia Systems, we will take the experience of conference calls to a completely new level,” said René Svendsen-Tune, President and CEO of GN Audio, in a statement.

What’s notable is that GN is a vertically integrated company, building not just hardware but the software to run on it. The AI engine underpinning some of its software development will be getting a vast new trove of data fed into it now by way of the PanaCast solution: not just in terms of video, but the large amount of audio that will naturally come along with it.

“Combining PanaCast’s immersive, intelligent video with GN Audio’s intelligent audio solutions will enable us to deliver a whole new class of collaboration products for our customers,” said Aurangzeb Khan, President and CEO of Altia Systems, in a statement. “PanaCast’s solutions enable companies to improve meeting participants’ experience, automate workflows, and enhance business efficiency and real estate utilization with data lakes of valid information.”

Given GN’s work with Android and iOS devices, it will be interesting to see how and if these video solutions make their way to those platforms as well, either by way of solutions that work on their phones or perhaps more native integrations down the line.

Regardless of how that develops, what’s clear is that there remains a market not just for basic tools to get work done, but technology to improve the quality of those tools, and that’s where GN hopes it will resonate with this deal.

Feb
19
2019
--

Senseon raises $6.4M to tackle cybersecurity threats with an AI ‘triangulation’ approach

Darktrace helped pave the way for using artificial intelligence to combat malicious hacking and enterprise security breaches. Now a new U.K. startup founded by an ex-Darktrace executive has raised some funding to take the use of AI in cybersecurity to the next level.

Senseon, which has pioneered a new model that it calls “AI triangulation” — simultaneously applying artificial intelligence algorithms to oversee, monitor and defend an organization’s network appliances, endpoints and “investigator bots” covering multiple microservices — has raised $6.4 million in seed funding.

David Atkinson — the startup’s CEO and founder who had previously been the commercial director for Darktrace and before that helped pioneer new cybersecurity techniques as an operative at the U.K.’s Ministry of Defense — said that Senseon will use the funding to continue to expand its business both in Europe and the U.S. 

The deal was co-led by MMC Ventures and Mark Weatherford, who is chief cybersecurity strategist at vArmour (which itself raised money in recent weeks) and previously Deputy Under Secretary for Cybersecurity, U.S. Department of Homeland Security. Others in the round included Amadeus Capital Partners, Crane Venture Partners and CyLon, a security startup incubator in London.

As Atkinson describes it, triangulation was an analytics concept first introduced by the CIA in the U.S., a method of bringing together multiple vectors of information to unearth inconsistencies in a data set (you can read more on triangulation in this CIA publication). He saw an opportunity to build a platform that took the same kind of approach to enterprise security.

There are a number of companies that are using AI-based techniques to help defend against breaches — in addition to Darktrace, there is Hexadite (a remediation specialist acquired by Microsoft), Amazon is working in the field and there are many others. In fact, I think you’d be hard-pressed to find any IT security company today that doesn’t use AI in its approach, or at least claim to.

Atkinson claims, however, that many AI-based solutions — and many other IT security products — take siloed, single-point approaches to defending a network. That is to say, you have network appliance security products, endpoint security, perhaps security for individual microservices and so on.

But while many of these work well, you don’t always get those different services speaking to each other. And that doesn’t reflect the shape that the most sophisticated security breaches are taking today.

As cybersecurity breaches and identified vulnerabilities continue to grow in frequency and scope — with hundreds of millions of individuals’ and organizations’ data potentially exposed in the process, systems disabled, and more — we’re seeing an increasing amount of sophistication on the part of the attackers.

Yes, those malicious actors employ artificial intelligence. But — as described in this 2019 paper on the state of cybersecurity from Symantec — they are also taking advantage of bigger “surface areas” with growing networks of connected objects all up for grabs; and they are tackling new frontiers like infiltrating data in transport and cloud-based systems. (In terms of examples of new frontiers, mobile networks, biometric data, gaming networks, public clouds and new card-skimming techniques are some of the specific areas that Experian calls out.)

Senseon’s antidote has been to build a new platform that “emulates how analysts think,” said Atkinson. Looking at an enterprise’s network appliance, an endpoint and microservices in the cloud, the Senseon platform “has an autonomous conversation” using the source data, before it presents a conclusion, threat, warning or even breach alert to the organization’s security team.

“We have an ability to take observations and compare that to hypothetical scenarios. When we tell you something, it has a rich context,” he said. Single-point alternatives essentially can create “blind spots that hackers manoeuvre around. Relying on single-source intelligence is like tying one hand behind your back.”

After Senseon compiles its data, it sends out alerts to security teams in a remediation service. Interestingly, while the platform’s aim is to identify malicious activity in a network, another consequence of what it’s doing is to help organizations identify “false positives” that are not actually threats, to cut down on time and money that get wasted on investigating those.
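
As a rough illustration of the triangulation idea, the sketch below corroborates a suspicion across independent telemetry sources before raising an alert, which is what cuts down on single-source false positives. The event shapes and the two-source threshold are invented for the example, not Senseon’s actual format.

```python
from collections import defaultdict

def triangulate(events, min_sources=2):
    # Collect which distinct sources flagged each host as suspicious.
    seen = defaultdict(set)
    for event in events:
        if event["suspicious"]:
            seen[event["host"]].add(event["source"])
    # Alert only on hosts flagged by at least `min_sources` distinct sources.
    return {host for host, srcs in seen.items() if len(srcs) >= min_sources}

events = [
    {"host": "db-1",  "source": "network",  "suspicious": True},
    {"host": "db-1",  "source": "endpoint", "suspicious": True},
    {"host": "web-3", "source": "network",  "suspicious": True},  # single source: likely a false positive
]
print(triangulate(events))  # {'db-1'}
```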

“Organisations of all sizes need to get better at keeping pace with emerging threats, but more importantly, identifying the attacks that require intervention,” said Mina Samaan of MMC Ventures in a statement. “Senseon’s technology directly addresses this challenge by using reinforcement learning AI techniques to help over-burdened security teams better understand anomalous behaviour through a single holistic platform.”

Although Senseon is only announcing seed funding today, the company has actually been around since 2017 and already has customers, primarily in the finance and legal industries (it would only give out one customer reference, the law firm of Harbottle & Lewis).

Feb
14
2019
--

Peltarion raises $20M for its AI platform

Peltarion, a Swedish startup founded by former execs from companies like Spotify, Skype, King, TrueCaller and Google, today announced that it has raised a $20 million Series A funding round led by Euclidean Capital, the family office for hedge fund billionaire James Simons. Previous investors FAM and EQT Ventures also participated, and this round brings the company’s total funding to $35 million.

There is obviously no dearth of AI platforms these days. Peltarion focuses on what it calls “operational AI.” The service offers an end-to-end platform that lets you do everything from pre-processing your data to building models and putting them into production. All of this runs in the cloud and developers get access to a graphical user interface for building and testing their models. All of this, the company stresses, ensures that Peltarion’s users don’t have to deal with any of the low-level hardware or software and can instead focus on building their models.

“The speed at which AI systems can be built and deployed on the operational platform is orders of magnitude faster compared to the industry standard tools such as TensorFlow and require far fewer people and decreases the level of technical expertise needed,” Luka Crnkovic-Friis, Peltarion’s CEO and co-founder, tells me. “All this results in more organizations being able to operationalize AI and focusing on solving problems and creating change.”

In a world where businesses have a plethora of choices, though, why use Peltarion over more established players? “Almost all of our clients are worried about lock-in to any single cloud provider,” Crnkovic-Friis said. “They tend to be fine using storage and compute as they are relatively similar across all the providers and moving to another cloud provider is possible. Equally, they are very wary of the higher-level services that AWS, GCP, Azure, and others provide as it means a complete lock-in.”

Peltarion, of course, argues that its platform doesn’t lock in its users and that other platforms take far more AI expertise to produce commercially viable AI services. The company rightly notes that, outside of the tech giants, most companies still struggle with how to use AI at scale. “They are stuck on the starting blocks, held back by two primary barriers to progress: immature patchwork technology and skills shortage,” said Crnkovic-Friis.

The company will use the new funding to expand its development team and its teams working with its community and partners. It’ll also use the new funding for growth initiatives in the U.S. and other markets.

Feb
14
2019
--

Zoho’s office suite gets smarter

As far as big tech companies go, Zoho is a bit different. Not only has it never taken any venture funding, it also offers more than 40 products that range from its online office suite to CRM and HR tools, email, workflow automation services, video conferencing, a bug tracker and everything in-between. You don’t often hear about it, but the company has more than 45 million users worldwide and offices in the U.S., Netherlands, Singapore, Dubai, Yokohama and Beijing — and it owns its data centers, too.

Today, Zoho is launching a major update to its core office suite products: Zoho Writer, Sheet, Show and Notebooks. These tools are getting an infusion of AI — under Zoho’s “Zia” brand — as well as new Apple TV and Android integrations and more. All of the tools are getting some kind of AI-based feature or another, but they are also getting support for Zia Voice, Zoho’s conversational AI assistant.

With this, you can now ask questions about data in your spreadsheets, for example, and Zia will create charts and even pivot tables for you. Similarly, Zoho is using Zia in its document editor and presentation tools to provide better grammar and spellchecking tools (and it’ll now offer a readability score and tips for improving your text). In Zoho Notebook, the note-taking application that is also the company’s newest app, Zia can help users create different formats for their note cards based on the content (text, photo, audio, checklist, sketch, etc.).
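
For a sense of what “create a pivot table for you” involves under the hood, here is a minimal, hypothetical sketch of the grouping-and-aggregation step; it illustrates the concept, not Zia’s implementation.

```python
from collections import defaultdict

def pivot(rows, row_key, col_key, value_key):
    # Group rows by (row_key, col_key) and sum the value field.
    table = defaultdict(lambda: defaultdict(float))
    for row in rows:
        table[row[row_key]][row[col_key]] += row[value_key]
    return {r: dict(cols) for r, cols in table.items()}

sales = [
    {"region": "EU", "quarter": "Q1", "amount": 100.0},
    {"region": "EU", "quarter": "Q1", "amount": 50.0},
    {"region": "US", "quarter": "Q2", "amount": 75.0},
]
print(pivot(sales, "region", "quarter", "amount"))
# {'EU': {'Q1': 150.0}, 'US': {'Q2': 75.0}}
```

The conversational layer’s job, in such a design, would be to map a question like “sales by region and quarter” onto the three key arguments.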

“We want to make AI helpful in a very contextual manner for a specific application,” Raju Vegesna, Zoho’s chief evangelist, told me. “Because we do AI across the board, we learned a lot and were able to apply learnings on one technology and one piece of context and apply that to another.” Zoho first brought Zia to its business intelligence app, for example, and now it’s essentially bringing the same capabilities to its spreadsheet app, too.

It’s worth noting that Google and Microsoft are doing similar things with their productivity apps, too, of course. Zoho, however, argues that it offers a far wider range of applications — and its stated mission is that you should be able to run your entire business on its platform. And the plan is to bring some form of AI to all of them. “Fast-forward a few months and [our AI grammar and spellchecker] is applied to the business application context — maybe a support agent responding to a customer ticket can use this technology to make sure there are no typos in those responses,” Vegesna said.

There are plenty of other updates in this release, too. Zoho Show now works with Apple TV-enabled devices, for example, and Android users can now use their phones as a smart remote for Show. Zoho Sheet now lets you build custom functions and scripts, and Zoho Writer’s web, mobile and iPad versions can now work completely offline.

The broader context here, though, is that Zoho, with its ridiculously broad product portfolio, is playing a long game. The company has no interest in going public. But it also knows that it’s going up against companies like Google and Microsoft. “Vertical integration is not something that you see in our industry,” said Vegesna. “Companies are in that quick mode of getting traction, sell or go public. We are looking at it in the 10 to 20-year time frame. To really win that game, you need to make these serious investments in the market. The improvements you are seeing here are at the surface level. But we don’t see ourselves as a software company. We see ourselves as a technology company.” And to build up these capabilities, Vegesna said, Zoho has invested hundreds of millions of dollars into its own data centers in the U.S., Europe and Asia, for example.

Feb
12
2019
--

Google takes Hire, its G Suite recruitment platform, to its first global markets, UK and Canada

The recruitment market is big business — worth some $554 billion annually according to the most recent report from the World Employment Confederation. In the tech world, that translates into a big opportunity to build tools that make a recruiter’s work easier, faster and more likely to succeed in finding the right people for the job. Now Google is stepping up its own efforts in the space: today it is expanding Hire, its G Suite-based recruitment management platform, to the UK and Canada, its first international markets outside the US.

Google is a somewhat late entrant into the market, launching Hire only in 2017 with the basic ability to use apps like Gmail, Calendar, Spreadsheets and Google Voice to help people manage and track candidates through the recruiting process and doing so by integrating with third-party job boards. In the interim, it has supercharged the service with bells and whistles that draw on the company’s formidable IP in areas like AI and search.

These tools provide robotic process automation-style aids to take away some of the more repetitive tasks around admin.

“Recruiters want time to talk to candidates but they don’t want to sit in systems clicking things,” said Dmitri Krakovsky, the VP who leads Hire for Google. “We give time back by automating a lot of functionality.” The tools also sift out needles in haystacks of applicants and surface interesting “lookalikes” who didn’t quite make the cut (or take the job) so that they can be targeted for future opportunities.

And — naturally — while Hire links up with third-party job boards via services like eQuest to bring inbound people into the system, it also provides seamless integration with Jobs by Google, Google’s own vertical search effort that is taking on the traditional job board by letting people look for opportunities with natural language queries in Google’s basic search window.

Krakovsky said that the first international launches in Canada and the UK made sense because of the lack of language barrier between them and the US. The UK was key for another reason, too: it gave Google the chance to tweak the product to comply with GDPR, he said, for future launches.

While markets like the UK and US represent some of the very biggest for recruitment services globally, it’s a long tail opportunity, and over time, the ambition will be to take Hire global, positioning it as a key rival against the likes of Taleo, LinkedIn, Jobvite, Zoho, SmartRecruiter and many others in the area of applicant sourcing and tracking.

Currently, Hire ranks only at number 23 among the top 100 applicant tracking systems globally, according to research from OnGig, which nonetheless singles it out for its potential because it is, after all, Google. For now, Krakovsky said, it’s not taking on large enterprises or tiny mom-and-pop shops, but the very large opportunity in between: companies of between 10 and a couple of thousand employees.

The bigger opportunity for Google is on a couple of levels. First, it sells Hire as a paid product that helps bolster the company’s wider offering of Google Cloud Platform software and services. These prices range from $100/month to $400/month depending on company size (and you work directly with Google on pricing if your organization is over 100 employees). Second, it bolsters the company’s wider ambitions in recruitment, which also include the API-based Cloud Talent Solutions and its vertical search job boards. It’s a quiet but huge strategy, considering the size of the market that is being tackled.

Google’s supercharging of Hire with AI and taking it international highlights another point. One of the biggest meta-trends in recruitment has been a push to try to hire with more diversity in mind, not just to bring fairness to the process, but to infuse businesses with different ways of thinking and catering to different audiences.

While AI is something that can definitely speed up certain processes, it has also been shown to be a potential cesspool of bias based on what is fed into it. One particularly messy example of that, in fact, came from an attempt by Amazon to build an AI-based recruitment tool, which it eventually had to shut down.

Google is well aware of that and has been keeping it in mind when building and expanding Hire particularly to new territories, which in themselves are exercises in handling diversity for AI systems.

Krakovsky noted that one example of how Google has been building more “understanding” AI is in its searches for veterans, who can look for jobs using their own jargon for expertise, which automatically gets translated into other skills in the way they might be described by employers outside the military.
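
Google’s system presumably relies on learned models; the dictionary-based sketch below, with invented mappings, only illustrates the underlying query-expansion idea of translating one community’s jargon into the skill terms employers use.

```python
# Hypothetical jargon-to-skill mappings; a real system would learn these
# rather than hard-code them.
JARGON_TO_SKILLS = {
    "11B": ["security operations", "team leadership"],
    "25B": ["it systems administration", "network maintenance"],
}

def expand_query(terms):
    # Replace any recognized jargon term with the skills it implies;
    # pass everything else through unchanged.
    expanded = []
    for term in terms:
        expanded.extend(JARGON_TO_SKILLS.get(term, [term]))
    return expanded

print(expand_query(["25B", "cloud"]))
# ['it systems administration', 'network maintenance', 'cloud']
```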

That’s for sourcing jobs, of course. The key for the tech world, if it wants to build products that will have international staying power to upset the existing “hire”archy (sorry), will be to bring that kind of levelling to every aspect of the recruiting process over time.

Those at the top are not sitting back, either: just yesterday Jobvite (ranked fifth largest ATS tracking platform) announced a funding round of $200 million and three acquisitions.

Feb
07
2019
--

Microsoft Azure sets its sights on more analytics workloads

Enterprises now amass huge amounts of data, both from their own tools and applications, as well as from the SaaS applications they use. For a long time, that data was basically exhaust. Maybe it was stored for a while to fulfill some legal requirements, but then it was discarded. Now, data is what drives machine learning models, and the more data you have, the better. It’s maybe no surprise, then, that the big cloud vendors started investing in data warehouses and lakes early on. But that’s just a first step. After that, you also need the analytics tools to make all of this data useful.

Today, it’s Microsoft’s turn to shine the spotlight on its data analytics services. The actual news here is pretty straightforward. Two services are moving into general availability: the second generation of Azure Data Lake Storage for big data analytics workloads, and Azure Data Explorer, a managed service that makes ad-hoc analysis of massive data volumes easier. Microsoft is also previewing a new feature in Azure Data Factory, its graphical no-code service for building data transformations: the ability to map data flows.

Those individual news pieces are interesting if you are a user or are considering Azure for your big data workloads, but what’s maybe more important here is that Microsoft is trying to offer a comprehensive set of tools for managing and storing this data — and then using it for building analytics and AI services.

(Photo credit: Josh Edelson/AFP/Getty Images)

“AI is a top priority for every company around the globe,” Julia White, Microsoft’s corporate VP for Azure, told me. “And as we are working with our customers on AI, it becomes clear that their analytics often aren’t good enough for building an AI platform.” These companies are generating plenty of data, which then has to be pulled into analytics systems. She stressed that she couldn’t remember a customer conversation in recent months that didn’t focus on AI. “There is urgency to get to the AI dream,” White said, but the growth and variety of data presents a major challenge for many enterprises. “They thought this was a technology that was separate from their core systems. Now it’s expected for both customer-facing and line-of-business applications.”

Data Lake Storage helps with managing this variety of data since it can handle both structured and unstructured data (and is optimized for the Spark and Hadoop analytics engines). The service can ingest any kind of data — yet Microsoft still promises that it will be very fast. “The world of analytics tended to be defined by having to decide upfront and then building rigid structures around it to get the performance you wanted,” explained White. Data Lake Storage, on the other hand, wants to offer the best of both worlds.
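The “decide upfront” trade-off White describes is essentially schema-on-write versus schema-on-read. As a toy illustration of the latter (plain Python, nothing Azure-specific): records of varying shape are stored as-is, and structure is imposed only when a question is asked.

```python
import json

# Raw events land in the "lake" as-is -- no upfront schema required.
raw_events = [
    '{"type": "click", "user": "a", "ms": 120}',
    '{"type": "purchase", "user": "b", "amount": 9.99}',
    '{"type": "click", "user": "c", "ms": 85}',
]

def query_clicks(events):
    """Schema-on-read: parse and filter only at query time."""
    parsed = [json.loads(e) for e in events]
    clicks = [e for e in parsed if e.get("type") == "click"]
    return sum(e["ms"] for e in clicks) / len(clicks)

print(query_clicks(raw_events))  # average click latency: 102.5
```

A schema-on-write system would have rejected the heterogeneous records at ingest time; the schema-on-read approach trades upfront rigidity for flexibility, which is why the service's promise of fast performance on top of it is the notable claim here.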

Likewise, White argued that while many enterprises still run these services on their own servers, many of those deployments remain appliance-based. She believes the cloud has now reached the point where the price/performance calculations are in its favor. It took a while to get there, though, and to convince enterprises. White noted that for the longest time, enterprises saw analytics as $300 million projects that took forever, tied up lots of people and were frankly a bit scary. “But also, what we had to offer in the cloud hasn’t been amazing until some of the recent work,” she said. “We’ve been on a journey — as well as the other cloud vendors — and the price performance is now compelling.” And it sure helps that if enterprises want to meet their AI goals, they’ll now have to tackle these workloads, too.

Feb
07
2019
--

Gong.io nabs $40M investment to enhance CRM with voice recognition

With traditional CRM tools, salespeople add basic details about the companies to the database, then a few notes about their interactions. AI has helped automate some of that, but Gong.io wants to take it even further, using voice recognition to capture every word of every interaction. Today, it got a $40 million Series B investment.

The round was led by Battery Ventures, with existing investors Norwest Venture Partners, Shlomo Kramer, Wing Venture Capital, NextWorld Capital and Cisco Investments also participating. Battery general partner Dharmesh Thakker will join the startup’s board under the terms of the deal. Today’s investment brings the total raised so far to $68 million, according to the company.

Indeed, $40 million is a hefty Series B, but investors see a tool that has the potential to have a material impact on sales, or at least give management a deeper understanding of why a deal succeeded or failed using artificial intelligence, specifically natural language processing.

Company co-founder and CEO Amit Bendov says the solution starts by monitoring all customer-facing conversations and giving feedback in a fully automated fashion. “Our solution uses AI to extract important bits out of the conversation to provide insights to customer-facing people about how they can get better at what they do, while providing insights to management about how staff is performing,” he explained. It goes one step further by offering strategic input, such as how your competitors are trending or how customers are responding to your products.
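To make the idea of “extracting important bits out of the conversation” concrete, here is a toy sketch of topic spotting in a call transcript. The topic lists and transcript are invented, and Gong’s actual system uses far more sophisticated NLP models, not keyword matching; this only illustrates the general shape of turning raw conversation into structured signals.

```python
import re
from collections import Counter

# Hypothetical topics a sales team might track in call transcripts.
TRACKED_TOPICS = {
    "pricing": {"price", "pricing", "discount", "expensive"},
    "competitor": {"acme", "rivalco"},
}

def extract_signals(transcript: str) -> Counter:
    """Count mentions of each tracked topic in a transcript."""
    words = re.findall(r"[a-z]+", transcript.lower())
    hits = Counter()
    for word in words:
        for topic, keywords in TRACKED_TOPICS.items():
            if word in keywords:
                hits[topic] += 1
    return hits

call = "The pricing feels expensive compared to Acme. Any discount?"
print(extract_signals(call))  # pricing mentioned 3x, competitor 1x
```

Aggregated across thousands of calls, even simple signals like these let management correlate conversation patterns with won and lost deals, which is the insight gap Bendov describes below.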

Screenshot: Gong.io

Bendov says he started the company because he had had this experience at previous startups: he wanted to know more about why he lost a sale, but looking at the data in the CRM database offered no insight. “CRM could tell you what customers you have, how many sales you’re making, who is achieving quota or not, but never give me the information to rationalize and improve operations,” he said.

The company currently has 350 customers, a number that has more than tripled since the end of 2017, when it had 100. Bendov says the growth isn’t just from new customers; existing ones are expanding their use, and churn is close to zero.

Today, Gong has 120 employees, with headquarters in San Francisco and a 55-person R&D team in Israel. Bendov expects the number of employees to double over the next year with the new influx of money to keep up with the customer growth.

Feb
05
2019
--

Databricks raises $250M at a $2.75B valuation for its analytics platform

Databricks, the company founded by the original team behind the Apache Spark big data analytics engine, today announced that it has raised a $250 million Series E round led by Andreessen Horowitz. Coatue Management, Green Bay Ventures, Microsoft and NEA also participated in this round, which brings the company’s total funding to $498.5 million. Microsoft’s involvement here is probably a bit of a surprise, but it’s worth noting that it also worked with Databricks on the launch of Azure Databricks as a first-party service on the platform, something that’s still a rarity in the Azure cloud.

As Databricks also today announced, its annual recurring revenue now exceeds $100 million. The company didn’t share whether it’s cash flow-positive at this point, but Databricks CEO and co-founder Ali Ghodsi shared that the company’s valuation is now $2.75 billion.

Current customers, which the company says number around 2,000, include the likes of Nielsen, Hotels.com, Overstock, Bechtel, Shell and HP.

“What Ali and the Databricks team have built is truly phenomenal,” Green Bay Ventures co-founder Anthony Schiller told me. “Their success is a testament to product innovation at the highest level. Databricks is without question best-in-class and their impact on the industry proves it. We were thrilled to participate in this round.”

While Databricks is best known for its contributions to Apache Spark, the company monetizes that work by offering its Unified Analytics platform on top of it. This platform lets enterprises build their data pipelines across data storage systems and prepare data sets for data scientists and engineers. To do this, Databricks offers shared notebooks and tools for building, managing and monitoring data pipelines; that data can then be used to build machine learning models, for example. Indeed, training and deploying these models is one of the company’s focus areas these days, which makes sense, given that this is one of the main use cases for big data.
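The prepare-then-train flow described above can be sketched at toy scale as a chain of pipeline stages. This is plain Python with invented records and a deliberately trivial “model” step; on Databricks, the equivalent stages would run on Spark across distributed storage, with a real library such as MLlib doing the training.

```python
from statistics import mean

# Invented raw records, including one dirty row a pipeline must handle.
raw = [
    {"mileage": 42000, "price": 19000},
    {"mileage": None, "price": 15000},   # dirty record, dropped by clean()
    {"mileage": 68000, "price": 12500},
]

def clean(records):
    """Drop records with missing values."""
    return [r for r in records if all(v is not None for v in r.values())]

def featurize(records):
    """Derive model-ready features: (thousands of miles, price)."""
    return [(r["mileage"] / 1000, r["price"]) for r in records]

def train(samples):
    """Trivial stand-in 'model': average price per 1,000 miles."""
    return mean(price / miles for miles, price in samples)

model = train(featurize(clean(raw)))
```

The point is the shape, not the math: each stage hands a cleaner, more structured data set to the next, which is exactly the pipeline-building and monitoring work the Unified Analytics platform packages up for enterprises.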

On top of that, Databricks also offers a fully managed service for hosting all of these tools.

“Databricks is the clear winner in the big data platform race,” said Ben Horowitz, co-founder and general partner at Andreessen Horowitz, in today’s announcement. “In addition, they have created a new category atop their world-beating Apache Spark platform called Unified Analytics that is growing even faster. As a result, we are thrilled to invest in this round.”

Ghodsi told me that Horowitz was also instrumental in getting the company to re-focus on growth. The company was already growing fast, of course, but Horowitz asked him why Databricks wasn’t growing faster. Unsurprisingly, given that it’s an enterprise company, that means aggressively hiring a larger sales force — and that’s costly. Hence the company’s need to raise at this point.

As Ghodsi told me, one of the areas the company wants to focus on is the Asia Pacific region, where overall cloud usage is growing fast. The other area the company is focusing on is support for more verticals like mass media and entertainment, federal agencies and fintech firms, which also comes with its own cost, given that the experts there don’t come cheap.

Ghodsi likes to call this “boring AI,” since it’s not as exciting as self-driving cars. In his view, though, the enterprise companies that don’t start using machine learning now will inevitably be left behind in the long run. “If you don’t get there, there’ll be no place for you in the next 20 years,” he said.

Engineering, of course, will also get a chunk of this new funding, with an emphasis on relatively new products like MLflow and Delta, two tools Databricks recently developed that make it easier to manage the life cycle of machine learning models and to build the necessary data pipelines that feed them.
