Jun 12, 2019

RealityEngines.AI raises $5.25M seed round to make ML easier for enterprises

RealityEngines.AI, a research startup that wants to help enterprises make better use of AI, even when they only have incomplete data, today announced that it has raised a $5.25 million seed funding round. The round was led by former Google CEO and Chairman Eric Schmidt and Google founding board member Ram Shriram. Khosla Ventures, Paul Buchheit, Deepchand Nishar, Elad Gil, Keval Desai, Don Burnette and others also participated in this round.

The fact that the startup was able to raise funding from this rather prominent group of investors clearly shows that its overall thesis resonates. The company, which doesn’t have a product yet, tells me that it specifically wants to help enterprises make better use of the smaller and noisier data sets they have and provide them with state-of-the-art machine learning and AI systems that they can quickly take into production. It also aims to provide its customers with systems that can explain their predictions and are free of various forms of bias, something that’s hard to do when the system is essentially a black box.

As RealityEngines CEO Bindu Reddy, who was previously the head of products for Google Apps, told me, the company plans to use the funding to build out its research and development team. The company, after all, is tackling some of the most fundamental and hardest problems in machine learning right now — and that costs money. Some of these problems, like working with smaller data sets, already have partial solutions, such as generative adversarial networks that can augment existing data sets, and RealityEngines expects to build on and improve those techniques.

Reddy is also betting on reinforcement learning as one of the core machine learning techniques for the platform.

Once it has its product in place, the plan is to make it available as a pay-as-you-go managed service that will make machine learning more accessible not only to large enterprises, but also to small and medium-sized businesses, which increasingly need access to these tools to remain competitive.

Jun 11, 2019

Alyce picks up $11.5 million Series A to help companies give better corporate gifts

Alyce, an AI-powered platform that helps salespeople, marketers and event planners give better corporate gifts, has today announced the close of an $11.5 million Series A funding round. The round was led by Manifest, with participation from General Catalyst, Boston Seed Capital, Golden Ventures, Morningside and Victress Capital.

According to Alyce, $120 billion is spent each year (just in the United States) on corporate gifts, swag, etc. Unfortunately, the impact of these gifts isn’t usually worth the hassle. No matter how thoughtful or clever a gift is, each recipient is a unique individual with their own preferences and style. It’s nearly impossible for marketers and event planners to find a one-size-fits-all gift for their recipients.

Alyce, however, has a solution. The company asks the admin to upload a list of recipients. The platform then scours the internet for any publicly available information on each individual recipient, combing through their Instagram, Twitter, Facebook, LinkedIn, videos and podcasts in which they appear, etc.

Alyce then matches each individual recipient with their own personalized gift, as chosen from one of the company’s merchant partners. The platform sends out an invitation to that recipient to either accept the gift, exchange the gift for something else on the platform, or donate the dollar value to the charity of their choice.

This allows Alyce to ensure that marketers and salespeople always end up giving the right gift, even when their first pick misses the mark. For charity donations, the donation is made in the name of the corporate entity that gave the gift, not the recipient, meaning that all donations act as a write-off for the gifting company.

The best marketers and sales people know how impactful a great gift, at the right time, can be. But the work involved in figuring out what a person actually wants to receive can be overwhelming. Hell, I struggle to find the right gifts for my close friends and loved ones.

Alyce takes all the heavy lifting out of the equation.

The company also has integrations with Salesforce, so users can send an Alyce gift from directly within Salesforce.

Alyce charges a subscription to businesses that use the software, and also takes a small cut of gifts accepted on the platform. The company also offers to send physical boxes with cards and information about the gift as another revenue channel.

Alyce founder and CEO Greg Segall says the company is growing 30 percent month-over-month and has clients such as InVision, Lenovo, Marketo and Verizon.

Jun 10, 2019

Qubole launches Quantum, its serverless database engine

Qubole, the data platform founded by Apache Hive creator and former head of Facebook’s Data Infrastructure team Ashish Thusoo, today announced the launch of Quantum, its first serverless offering.

Qubole may not necessarily be a household name, but its customers include the likes of Autodesk, Comcast, Lyft, Nextdoor and Zillow. For these users, Qubole has long offered a self-service platform that allowed their data scientists and engineers to build their AI, machine learning and analytics workflows on the public cloud of their choice. The platform sits on top of open-source technologies like Apache Spark, Presto and Kafka, for example.

Typically, enterprises have to provision a considerable amount of infrastructure to give these platforms the resources they need. Much of that capacity often goes unused, and the infrastructure can quickly become complex.

Qubole already abstracts most of this away, offering what is essentially a serverless platform. With Quantum, however, it is going a step further by launching a high-performance serverless SQL engine that allows users to query petabytes of data with nothing else but ANSI-SQL, giving them the choice between using a Presto cluster or a serverless SQL engine to run their queries, for example.

The data can be stored on AWS and users won’t have to set up a second data lake or move their data to another platform to use the SQL engine. Quantum automatically scales up or down as needed, of course, and users can still work with the same metastore for their data, no matter whether they choose the clustered or serverless option. Indeed, Quantum is essentially just another SQL engine within Qubole’s overall suite of engines.

Typically, Qubole charges enterprises by compute minutes. When using Quantum, the company uses the same metric, but enterprises pay for the execution time of the query. “So instead of the Qubole compute units being associated with the number of minutes the cluster was up and running, it is associated with the Qubole compute units consumed by that particular query or that particular workload, which is even more fine-grained,” Thusoo explained. “This works really well when you have to do interactive workloads.”
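
To make the billing difference concrete, here is a minimal sketch in Python of how the two models compare. The rates, cluster size and query times are made-up placeholders, not Qubole’s actual pricing; the sketch only illustrates why per-query billing favors interactive workloads with lots of idle time.

```python
# Hypothetical comparison of cluster-minute billing vs. per-query billing.
# All rates and numbers are illustrative placeholders, not Qubole's real pricing.

QCU_PER_NODE_MINUTE = 0.5   # assumed compute units accrued per node per minute
PRICE_PER_QCU = 0.01        # assumed dollars per compute unit

def cluster_billing(nodes, cluster_uptime_min):
    """Classic model: pay for every minute the cluster is up, busy or idle."""
    qcu = nodes * cluster_uptime_min * QCU_PER_NODE_MINUTE
    return qcu * PRICE_PER_QCU

def per_query_billing(nodes, query_runtimes_min):
    """Quantum-style model: pay only for the execution time of each query."""
    qcu = sum(nodes * runtime * QCU_PER_NODE_MINUTE for runtime in query_runtimes_min)
    return qcu * PRICE_PER_QCU

# An analyst runs three short interactive queries over an 8-hour workday.
queries = [2.0, 3.5, 1.5]  # minutes of actual execution time
print("cluster-minute billing:", cluster_billing(nodes=10, cluster_uptime_min=480))
print("per-query billing:     ", per_query_billing(nodes=10, query_runtimes_min=queries))
```

With idle time dominating the workday, the toy per-query total comes out far lower, which is exactly the interactive scenario Thusoo describes.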

Thusoo notes that Quantum is targeted at analysts who often need to perform interactive queries on data stored in object stores. Qubole integrates with services like Tableau and Looker (which Google is now in the process of acquiring). “They suddenly get access to very elastic compute capacity, but they are able to come through a very familiar user interface,” Thusoo noted.

Jun 10, 2019

Microsoft Power Platform update aims to put AI in reach of business users

Low code and no code are the latest industry buzzwords, but if vendors can truly abstract away the complexity of difficult tasks like building machine learning models, it could help mainstream technologies that are currently out of reach of most business users. That’s precisely what Microsoft is aiming to do with its latest Power Platform announcements today.

The company tried to bring that low-code simplicity to building applications last year when it announced PowerApps. Now it believes that by combining PowerApps with Microsoft Flow and its new AI Builder tool, it can allow folks building apps with PowerApps to add a layer of intelligence very quickly.

It starts with having access to data sources, and the Data Connector tool gives users access to more than 250 data connectors. That includes Salesforce, Oracle and Adobe, as well as, of course, Microsoft services like Office 365 and Dynamics 365. Richard Riley, senior director for Power Platform marketing, says this is the foundation for pulling data into AI Builder.

“AI Builder is all about making it just as easy in a low-code, no-code way to go bring artificial intelligence and machine learning into your Power Apps, into Microsoft Flow, into the Common Data Service, into your data connectors, and so on,” Riley told TechCrunch.

Charles Lamanna, general manager at Microsoft, says that Microsoft can do all the analysis and heavy lifting required to build a data model for you, removing a huge barrier to entry for business users. “The basic idea is that you can select any field in the Common Data Service and just say, ‘I want to predict this field.’ Then we’ll actually go look at historical records for that same table or entity to go predict [the results],” he explained. This could be used to predict if a customer will sign up for a credit card, if a customer is likely to churn, or if a loan would be approved, and so forth.
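
To ground what “predict this field from historical records” means in practice, here is a minimal sketch using scikit-learn on a toy table. This is not Microsoft’s implementation or AI Builder’s API; the column names and data are hypothetical, and the sketch only illustrates the kind of model such a tool builds automatically behind its no-code interface.

```python
# Illustrative only: training a classifier to predict one field (e.g. churn)
# from historical records, the task AI Builder automates behind a no-code UI.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Toy historical records; columns and values are hypothetical.
records = pd.DataFrame({
    "months_as_customer": [3, 24, 1, 36, 12, 6, 48, 2],
    "support_tickets":    [5,  0, 7,  1,  2, 4,  0, 6],
    "churned":            [1,  0, 1,  0,  0, 1,  0, 1],  # the field we want to predict
})

X = records[["months_as_customer", "support_tickets"]]
y = records["churned"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Fit a simple model on past records and predict the target field for new ones.
model = LogisticRegression().fit(X_train, y_train)
print("predicted churn for held-out records:", model.predict(X_test))
```

AI Builder’s pitch is that the equivalent of these steps happens automatically once a user picks the field to predict.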

This announcement comes the same day that Salesforce announced it was buying Tableau for almost $16 billion, and just days after Google bought Looker for $2.6 billion. Together, the deals show how powerful data can be in a business context, especially when there is a way to put that data to use, whether in the form of visualization or inside business applications.

While Microsoft admits AI Builder won’t be something everyone uses, it does see a kind of power user who might previously have been unable to approach this level of sophistication on their own, now building apps and adding layers of intelligence without a heck of a lot of coding. If it works as advertised, it will bring a level of simplicity to tasks that were previously well out of reach of business users without requiring a data scientist. Regardless, all of this activity shows data has become central to business, and vendors are going to build or buy to put it to work.

Jun 10, 2019

Vectra lands $100M Series E investment for AI-driven network security

Vectra, a seven-year-old company that helps customers detect intrusions at the network level, whether in the cloud or on premises, announced a $100 million Series E funding round today led by TCV. Existing investors, including Khosla Ventures and Accel, also participated in the round, which brings the total raised to more than $200 million, according to the company.

As company CEO Hitesh Sheth explained, there are two primary types of intrusion detection. The first is endpoint detection; the second is his company’s area of coverage, network detection and response, or NDR. He says that adding a layer of artificial intelligence improves the overall results.

“One of the keys to our success has been applying AI to network traffic, the networking side of NDR, to look for the signal in the noise. And we can do this across the entire infrastructure, from the data center to the cloud all the way into end user traffic including IoT,” he explained.

He said that as companies move their data to the cloud, they are looking for ways to ensure the security of their most valuable data assets, and he believes his company’s NDR solution can provide that. In fact, securing the cloud side of the equation is one of the primary investment focuses for this round.

Tim McAdam, from lead investor TCV, says that the AI piece is a real differentiator for Vectra and one that attracted his firm to invest in the company. He said that while he realized that AI is an overused term these days, after talking to 30 customers he heard over and over again that Vectra’s AI-driven solution was a differentiator over competing products. “All of them have decided to standardize on the Vectra Cognito because to a person, they spoke of the efficacy and the reduction of their threat vectors as a result of standardizing on Vectra,” McAdam told TechCrunch.

The company was founded in 2012 and currently has 240 employees. It expects to double that headcount over the next 12 to 18 months with this funding.

Jun 10, 2019

Salesforce is buying data visualization company Tableau for $15.7B in all-stock deal

On the heels of Google buying analytics startup Looker last week for $2.6 billion, Salesforce today announced a huge piece of news in a bid to step up its own work in data visualization and (more generally) tools to help enterprises make sense of the sea of data that they use and amass: Salesforce is buying Tableau for $15.7 billion in an all-stock deal.

The latter is publicly traded, and the deal will see each share of Tableau Class A and Class B common stock exchanged for 1.103 shares of Salesforce common stock, the company said. The $15.7 billion figure is the enterprise value of the transaction, based on the average price of Salesforce’s shares as of June 7, 2019.
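
For readers who want the mechanics of a fixed-ratio, all-stock deal, here is a tiny sketch of the arithmetic. The Salesforce reference price below is a placeholder, not the actual June 7, 2019 average the companies used; only the 1.103 exchange ratio comes from the announcement.

```python
# Fixed-ratio, all-stock deal arithmetic (illustrative only).
EXCHANGE_RATIO = 1.103           # Salesforce shares issued per Tableau share (per the deal)
CRM_REFERENCE_PRICE = 150.00     # placeholder price, not the actual June 7, 2019 average

def implied_value_per_tableau_share(crm_price):
    """Each Tableau share converts into 1.103 Salesforce shares."""
    return EXCHANGE_RATIO * crm_price

print(f"implied value per Tableau share: ${implied_value_per_tableau_share(CRM_REFERENCE_PRICE):.2f}")
```

Scaling that per-share value across Tableau’s share count is roughly how a headline figure like the $15.7 billion above is derived from the average Salesforce price.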

This is a huge jump over Tableau’s most recent market cap: the company was valued at $10.79 billion at the close of trading Friday, according to figures on Google Finance. (Trading in its stock has been halted in light of this news.)

The two boards have already approved the deal, Salesforce notes. The two companies’ management teams will be hosting a conference call at 8am Eastern and I’ll listen in to that as well to get more details.

This is a huge deal for Salesforce as it continues to diversify beyond CRM software and into deeper layers of analytics.

The company reportedly tried hard, but ultimately failed, to buy LinkedIn (which Microsoft picked up instead), and while there isn’t a whole lot in common between LinkedIn and Tableau, this deal will also help Salesforce extend its engagement (and data intelligence) for the customers it already has, something LinkedIn would also have helped it do.

This also looks like a move designed to bulk up against Google’s acquisition of Looker, announced last week, although I’d argue that analytics is a big enough area that all the major tech companies courting enterprises are shoring up their strategies (and products) here. It’s unclear whether the two deals were made in response to each other, although it seems that Salesforce has been eyeing up Tableau for years.

“We are bringing together the world’s #1 CRM with the #1 analytics platform. Tableau helps people see and understand data, and Salesforce helps people engage and understand customers. It’s truly the best of both worlds for our customers–bringing together two critical platforms that every customer needs to understand their world,” said Marc Benioff, chairman and co-CEO, Salesforce, in a statement. “I’m thrilled to welcome Adam and his team to Salesforce.”

Tableau has about 86,000 business customers, including Charles Schwab, Verizon (which owns TC), Schneider Electric, Southwest and Netflix. Salesforce said Tableau will operate independently and under its own brand post-acquisition. It will also remain headquartered in Seattle, Wash., headed by CEO Adam Selipsky along with others on the current leadership team.

Indeed, later during the call, Benioff let it drop that Seattle would become Salesforce’s official second headquarters with the closing of this deal.

That’s not to say, though, that the two will not be working together.

On the contrary, Salesforce is already talking up the possibilities of expanding what the company is already doing with its Einstein platform (launched back in 2016, Einstein is the home of all of Salesforce’s AI-based initiatives); and with “Customer 360,” which is the company’s product and take on omnichannel sales and marketing. The latter is an obvious and complementary product home, given that one huge aspect of Tableau’s service is to provide “big picture” insights.

“Joining forces with Salesforce will enhance our ability to help people everywhere see and understand data,” said Selipsky. “As part of the world’s #1 CRM company, Tableau’s intuitive and powerful analytics will enable millions more people to discover actionable insights across their entire organizations. I’m delighted that our companies share very similar cultures and a relentless focus on customer success. I look forward to working together in support of our customers and communities.”

“Salesforce’s incredible success has always been based on anticipating the needs of our customers and providing them the solutions they need to grow their businesses,” said Keith Block, co-CEO, Salesforce. “Data is the foundation of every digital transformation, and the addition of Tableau will accelerate our ability to deliver customer success by enabling a truly unified and powerful view across all of a customer’s data.”

Jun 5, 2019

Microsoft and Oracle link up their clouds

Microsoft and Oracle announced a new alliance today that will see the two companies connect their clouds over a direct network connection so that their users can move workloads and data seamlessly between the two. The alliance goes a bit beyond basic connectivity and also includes identity interoperability.

This kind of alliance is relatively unusual between what are essentially competing clouds, but while Oracle wants to be seen as a major player in this space, it also realizes that it isn’t likely to get to the size of an AWS, Azure or Google Cloud anytime soon. For Oracle, the alliance means that its users can run services like the Oracle E-Business Suite and Oracle JD Edwards on Azure while still using an Oracle database in the Oracle Cloud, for example. With that, Microsoft still gets to run the workloads and Oracle gets to do what it does best (though Azure users will also continue to be able to run their Oracle databases in the Azure cloud).

“The Oracle Cloud offers a complete suite of integrated applications for sales, service, marketing, human resources, finance, supply chain and manufacturing, plus highly automated and secure Generation 2 infrastructure featuring the Oracle Autonomous Database,” said Don Johnson, executive vice president, Oracle Cloud Infrastructure (OCI), in today’s announcement. “Oracle and Microsoft have served enterprise customer needs for decades. With this alliance, our joint customers can migrate their entire set of existing applications to the cloud without having to re-architect anything, preserving the large investments they have already made.”

For now, the direct interconnect between the two clouds is limited to Azure US East and Oracle’s Ashburn data center. The two companies plan to expand this alliance to other regions in the future, though they remain mum on the details. It’ll support applications like JD Edwards EnterpriseOne, E-Business Suite, PeopleSoft, Oracle Retail and Hyperion on Azure, in combination with Oracle databases like RAC, Exadata and the Oracle Autonomous Database running in the Oracle Cloud.

“As the cloud of choice for the enterprise, with over 95% of the Fortune 500 using Azure, we have always been first and foremost focused on helping our customers thrive on their digital transformation journeys,” said Scott Guthrie, executive vice president of Microsoft’s Cloud and AI division. “With Oracle’s enterprise expertise, this alliance is a natural choice for us as we help our joint customers accelerate the migration of enterprise applications and databases to the public cloud.”

Today’s announcement also fits within a wider trend at Microsoft, which has recently started building a number of alliances with other large enterprise players, including its open data alliance with SAP and Adobe, as well as a somewhat unorthodox gaming partnership with Sony.

May 20, 2019

IDC: Asia-Pacific spending on AI systems will reach $5.5 billion this year, up 80% from 2018

Spending on artificial intelligence systems in the Asia-Pacific region is expected to reach $5.5 billion this year, an almost 80% increase over 2018, driven by businesses in China and the retail industry, according to IDC. In a new report, the research firm also said it expects AI spending to climb at a compound annual growth rate of 50% from 2018 to 2022, reaching a total of $15.06 billion in 2022.

This means AI spending growth in the Asia-Pacific region is expected to outpace the rest of the world over the next three years. In March, IDC forecast that worldwide spending on AI systems will grow at a CAGR of 38% between 2018 and 2022.
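
As a quick sanity check on the regional figures above, here is a small Python sketch of the standard CAGR arithmetic. The roughly $3 billion 2018 base is implied by the reported numbers (2019 spending of about $5.5 billion, up roughly 80% year over year), not a figure quoted directly from IDC.

```python
# Standard CAGR check: CAGR = (end / start) ** (1 / years) - 1

def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

base_2018 = 5.5 / 1.8      # implied 2018 APAC spending, in $ billions (derived, not reported)
forecast_2022 = 15.06      # IDC's 2022 forecast, in $ billions

print(f"implied 2018 base: ${base_2018:.2f}B")
print(f"2018-2022 CAGR:    {cagr(base_2018, forecast_2022, 4):.0%}")
```

The result lands within a point of the 50% CAGR IDC cites, so the regional figures are internally consistent.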

Most of the growth will happen in China, which IDC says will account for nearly two-thirds of AI spending in the region (excluding Japan) in all forecast years. Spending on AI systems will be driven by the retail, professional services and government sectors.

Retail demand for AI-based tools will also lead growth in the rest of the region, as companies begin to rely on it more for merchandising, product recommendations, automated customer service and supply and logistics. While the banking industry’s AI spending trails behind retail, it will also begin adopting the tech for fraud analysis, program advisors, recommendations and customer service. IDC forecasts that this year, companies will invest almost $700 million in automated service agents. The next largest area for investment is sales process recommendations and automation, with $450 million expected, and intelligent process automation at more than $350 million.

The fastest-growing industries for AI spending are expected to be healthcare (growing at 60.2% CAGR) and process manufacturing (60.1% CAGR). In terms of infrastructure, IDC says spending on hardware, including servers and storage, will reach almost $7 billion in 2019, while spending on software is expected to grow at a five-year CAGR of 80%.

May 17, 2019

Under the hood on Zoom’s IPO, with founder and CEO Eric Yuan

Extra Crunch offers members the opportunity to tune into conference calls led and moderated by the TechCrunch writers you read every day. This week, TechCrunch’s Kate Clark sat down with Eric Yuan, the founder and CEO of video communications startup Zoom, to go behind the curtain on the company’s recent IPO process and its path to the public markets.

Since hitting the trading desks just a few weeks ago, Zoom stock is up over 30%. But Zoom’s path to becoming a Silicon Valley and Wall Street darling was anything but easy. Eric tells Kate how the company’s early focus on profitability, which is now helping drive the stock’s strong performance out of the gate, actually made it difficult to get VC money early on, and how its consistent focus on user experience led to organic growth across different customer bases.

Eric: I experienced the year 2000 dot com crash and the 2008 financial crisis, and it almost wiped out the company. I only got seed money from my friends, and also one or two VCs like AME Cloud Ventures and Qualcomm Ventures.

And all other institutional VCs had no interest to invest in us. I was very paranoid and always thought, “Wow, we are not going to survive next week because we cannot raise the capital.” And on the way, I thought we have to look into our own destiny. We wanted to be cash flow positive. We wanted to be profitable.

And so by doing that, people thought I wasn’t as wise, because we’d probably be sacrificing growth, right? And a lot of other companies, they did very well and were not profitable because they focused on growth. And in the future they could be very, very profitable.

Eric and Kate also dive deeper into Zoom’s founding and Eric’s initial decision to leave WebEx to work on a better video communication solution. Eric also offers his take on what the future of video conferencing may look like in the next five to 10 years and gives advice to founders looking to build the next great company.

For access to the full transcription and the call audio, and for the opportunity to participate in future conference calls, become a member of Extra Crunch. Learn more and try it for free. 

Kate Clark: Well thanks for joining us Eric.

Eric Yuan: No problem, no problem.

Kate: Super excited to chat about Zoom’s historic IPO. Before we jump into questions, I’m just going to review some of the key events leading up to the IPO, just to give some context to any of the listeners on the call.

May 17, 2019

Health at Scale lands $16M Series A to bring machine learning to healthcare

Health at Scale, a startup with founders who have both medical and engineering expertise, wants to bring machine learning to bear on healthcare treatment options to produce outcomes with better results and less aftercare. Today the company announced a $16 million Series A. Optum, which is part of the UnitedHealth Group, was the sole investor.

Today, when people look at treatment options, they may look at a particular surgeon or hospital, or simply at what the insurance company will cover, but they typically lack the data to make truly informed decisions. This is true across every part of the healthcare system, particularly in the U.S. The company believes that, using machine learning, it can produce better results.

“We are a machine learning shop, and we focus on what I would describe as precision delivery. So in other words, we look at this question of how do we match patients to the right treatments, by the right providers, at the right time,” Zeeshan Syed, Health at Scale CEO told TechCrunch.

The founders see the current system as fundamentally flawed, and while they see their customers as insurance companies, hospital systems and self-insured employers, they say the tools they are putting into the system should help everyone in the loop get a better outcome.

The idea is to make treatment decisions more data-driven. While the founders aren’t sharing their data sources, they say they have information on everything from patients with a given condition, to the doctors who treat that condition, to the facilities where the treatment happens. By looking at a patient’s individual treatment needs and medical history, they believe they can do a better job of matching that person to the best doctor and hospital for the job. They say this will result in the fewest post-operative treatment requirements, whether that involves trips to the emergency room or time in a skilled nursing facility, all of which would end up adding significant additional cost.

If you’re thinking this is strictly about cost savings for these large institutions, Mohammed Saeed, who is the company’s chief medical officer and has an MD from Harvard and a PhD in electrical engineering from MIT, insists that isn’t the case. “From our perspective, it’s a win-win situation since we provide the best recommendations that have the patient interest at heart, but from a payer or provider perspective, when you have lower complication rates you have better outcomes and you lower your total cost of care long term,” he said.

The company says the solution is being used by large hospital systems and insurer customers, although it couldn’t share any names. The founders also said they have studied outcomes after customers began using the software and that the machine learning models have produced better results, although they couldn’t provide the data to back that up at this time.

The company was founded in 2015 and currently has 11 employees. It plans to use today’s funding to build out sales and marketing to bring the solution to a wider customer set.
