Mar
19
2019
--

The top 10 startups from Y Combinator W19 Demo Day 1

Electric-vehicle chargers, heads-up displays for soldiers and the Costco of weed were some of our favorites from prestigious startup accelerator Y Combinator’s Winter 2019 Demo Day 1. If you want to take the pulse of Silicon Valley, YC is the place to be. But with more than 200 startups presenting across two stages and two days, it’s tough to keep track.

You can check out our write-ups of all 85 startups that launched on Demo Day 1, and come back later for our full index and picks from Day 2. But now, based on feedback from top investors and TechCrunch’s team, here’s our selection of the top 10 companies from the first half of this Y Combinator batch, and why we picked each.

Ravn

Looking around corners is one of the most dangerous parts of war for infantry. Ravn builds heads-up displays that let soldiers and law enforcement see around corners, thanks to cameras on their guns, on drones or elsewhere. The ability to see the enemy while still behind cover saves lives, and Ravn already has $490,000 in Navy and Air Force contracts. With a CEO who served as a Navy SEAL before studying computer science, plus experts in augmented reality and in selling hardware to the Department of Defense, Ravn could deliver the inevitable future of soldier heads-up displays.

Why we picked Ravn: The AR battlefield is inevitable, but right now Microsoft’s HoloLens team is focused on providing mid-fight information, like how many bullets a soldier has in their clip and where their squad mates are. Ravn’s tech was built by a guy who watched the tragic consequences of getting into those shootouts. He wants to help soldiers avoid or win these battles before they get dangerous, and his team includes an expert in selling hardened tech to the U.S. government.

Middesk

It’s difficult to know if a business’ partners have paid their taxes, filed for bankruptcy or are involved in lawsuits. That leads businesses to write off $120 billion a year in uncollectable bad debt. Middesk does due diligence to sort out good businesses from the bad to provide assurance for B2B deals, loans, investments, acquisitions and more. By giving clients the confidence that they’ll be paid, Middesk could insert itself into a wide array of transactions.

Why we picked Middesk: It’s building the trust layer for the business world that could weave its way into practically every deal. More data means making fewer stupid decisions, and Middesk could put an end to putting faith in questionable partners.

Convictional

Convictional helps direct-to-consumer companies approach larger retailers more simply. It takes a lot of time for a supplier to build a relationship with a retailer and start selling their products. Convictional wants to speed things up by building a B2B self-service commerce platform that allows retailers to easily approach brands and make orders.

Why we picked Convictional: There’s been an explosion of D2C businesses selling everything from suitcases to shaving kits. But to drive exposure and scale, they need retail partners who are eager not to be cut out of this growing commerce segment. Playing middleman could put Convictional in a lucrative position, while also making it a nexus of valuable shopping data.

Dyneti Technologies

Dyneti has invented a credit card scanner SDK that uses a smartphone’s camera to reduce fraud by more than 50 percent and improve conversion for businesses by 5 percent. The business was started by a pair of former Uber employees, including CEO Julia Zheng, who launched the fraud analytics teams for Account Security and UberEATS. Dyneti’s service is powered by deep learning and works on any card format. In the two months since it launched, the company has signed contracts with Rappi, Gametime and others.

Why we picked Dyneti: Cybersecurity threats are growing and evolving, yet underequipped businesses are eager to do more business online. Dyneti is one of those fundamental B2B businesses that feels like Stripe — capable of bringing simplicity and trust to a complex problem so companies can focus on their product.

ampUp

The “Airbnb for electric-vehicle chargers,” ampUp is preparing for a world in which the majority of us drive EVs — it operates a mobile app that connects a network of thousands of EV chargers and drivers. Using the app, an electric-vehicle owner can quickly identify an available and compatible charger, and EV charger owners can earn cash sharing their charger at their own price and their own schedule. The service is currently live in the Bay Area.

Why we picked ampUp: Electric vehicles are inevitable, but reliable charging is one of the leading fears dissuading people from buying. Rather than build out some massive owned network of chargers that will never match the distributed gas station network, ampUp could put an EV charger anywhere there’s someone looking to make a few bucks.

Flockjay

Flockjay operates an online sales academy that teaches job seekers from underrepresented backgrounds the skills and training they need to pursue a career in tech sales. The 12-week bootcamp offers trainees coaching and mentorship. The company has launched its debut cohort with 17 students, 100 percent of whom are already in job interviews and 40 percent of whom have already secured new careers in the tech industry.

Why we picked Flockjay: Unlike coding bootcamps that can require intense prerequisites, killer salespeople can be molded from anyone with hustle. Those from underrepresented backgrounds already know how to expertly sell themselves to attain opportunities others take for granted. Flockjay could provide economic mobility at a crucial juncture when job security is shaky.

Deel

Twenty million international contractors work with U.S. companies, but it’s difficult to onboard and train them. Deel handles the contracts, payments and taxes in one interface to eliminate paperwork and wasted time. Deel charges businesses $10 per contractor per month and a 1 percent fee on payouts, which earns it an average of $560 per contractor per year.
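Those numbers imply a payout volume per contractor that Deel hasn’t stated directly; a quick back-of-the-envelope check, using only the figures quoted above (our arithmetic, not a Deel disclosure):

```python
# Back-of-the-envelope check of Deel's quoted economics.
monthly_fee = 10                   # $10 per contractor per month
payout_fee_rate = 0.01             # 1 percent fee on payouts
avg_revenue_per_contractor = 560   # dollars per year, as quoted

subscription_revenue = monthly_fee * 12                              # $120/year
payout_revenue = avg_revenue_per_contractor - subscription_revenue   # $440/year
implied_annual_payouts = payout_revenue / payout_fee_rate

print(implied_annual_payouts)  # 44000.0
```

In other words, the $560 average works out to roughly $44,000 in annual payouts flowing through Deel per contractor.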

Why we picked Deel: The destigmatization of remote work is opening new recruiting opportunities abroad for U.S. businesses. But unless teams can properly integrate these distant staffers, the cost savings of hiring overseas are negated. As the globalization megatrend continues, businesses will need better HR tools.

Glide

There has been a pretty major trend toward services that make it easier to build web pages or mobile apps. Glide lets customers easily create well-designed mobile apps from Google Sheets. This not only makes it easy to build the app, but also simplifies the skills needed to keep its information updated.

Why we picked Glide: While desktop website building is a brutally competitive market, it’s still not easy to make a mobile site if you’re not a coder. Rather than starting from a visual layout tool with which many people would still be unfamiliar, Glide starts with a spreadsheet that almost everyone has used. And as the web begins to feel less personal with all the brands and influencers, Glide could help people make bespoke apps that put intimacy and personality first.
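Glide hasn’t published its internal format, but the spreadsheet-to-app idea can be sketched as a simple mapping from sheet rows to screen items (the field and screen names here are ours, purely for illustration):

```python
# Toy sketch of a spreadsheet-to-app mapping: each sheet row becomes a
# list item in a generated app screen. Field names are illustrative only
# and are not Glide's actual schema.
def sheet_to_screen(title, rows):
    """Turn spreadsheet rows (dicts keyed by column header) into a screen definition."""
    return {
        "screen": title,
        "items": [
            {"title": row.get("Name", ""), "detail": row.get("Notes", "")}
            for row in rows
        ],
    }

rows = [
    {"Name": "Team offsite", "Notes": "March 22"},
    {"Name": "Demo day", "Notes": "March 19"},
]
app = sheet_to_screen("Events", rows)
print(len(app["items"]))  # 2
```

Because the app is derived from the sheet, updating a row in the spreadsheet is all it takes to update the app, which is the core of Glide’s pitch.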

Docucharm

The platform, co-founded by former Uber product manager Minh Tri Pham, turns documents into structured data a computer can understand to accurately automate document processing workflows and take away the need for human data entry. Docucharm’s API can understand various forms of documents (like paystubs, for example) and will extract the necessary information without error. Its customers include tax prep company Tributi and lending business Aspire.
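Docucharm’s actual API isn’t public, but the general shape of document-to-structured-data extraction can be illustrated with a toy paystub parser; the field names and patterns below are invented for this sketch and are not Docucharm’s:

```python
import re

# Toy illustration of turning a document into structured data.
# Field names and regex patterns are invented for this sketch;
# they are not Docucharm's API or extraction logic.
PATTERNS = {
    "employee": re.compile(r"Employee:\s*(.+)"),
    "gross_pay": re.compile(r"Gross Pay:\s*\$([\d,]+\.\d{2})"),
    "pay_date": re.compile(r"Pay Date:\s*(\d{4}-\d{2}-\d{2})"),
}

def extract_paystub(text):
    """Extract known fields from raw paystub text into a structured record."""
    record = {}
    for field, pattern in PATTERNS.items():
        match = pattern.search(text)
        if match:
            record[field] = match.group(1)
    return record

sample = """Employee: Jane Doe
Pay Date: 2019-03-15
Gross Pay: $4,230.77"""

print(extract_paystub(sample))
# {'employee': 'Jane Doe', 'gross_pay': '4,230.77', 'pay_date': '2019-03-15'}
```

A real system replaces the regexes with learned models that generalize across document layouts, which is where the deep-learning claim comes in.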

Why we picked Docucharm: Paying high-priced, high-skilled workers to do data entry is a huge waste. And optical character recognition like Docucharm’s will unlock new types of businesses based on data extraction. This startup could be the AI layer underneath it all.

Flower Co

Flower Co provides memberships for cheaper weed sales and delivery. Most dispensaries cater to high-end customers and newbies who want expensive products and tons of hand-holding. In contrast, Flower Co caters to long-time marijuana enthusiasts who want huge quantities at low prices. They’re currently selling $200,000 in marijuana per month to 700 members. They charge $100 a year for membership, and take 10 percent on product sales.

Why we picked Flower Co: Marijuana is the next gold rush, a once-in-a-generation land-grab opportunity. Yet most marijuana merchants have focused on hyper-discerning high-end customers despite the long-standing popularity of smoking big blunts of cheap weed with a bunch of friends. For those who want to make cannabis consumption a lifestyle, and there will be plenty, Flower Co could become their wholesaler.

Honorable Mentions

Atomic Alchemy – Filling the shortage of nuclear medicine

Yourchoice – Omni-gender non-hormonal birth control

Prometheus – Turning CO2 into gas

Lumos – Medical search engine for doctors

Heart Aerospace – Regional electric planes

Boundary Layer Technologies – Super-fast container ships

Additional reporting by Kate Clark, Greg Kumparak and Lucas Matney

Mar
19
2019
--

Vonage brings number programmability to its business service

Chances are you still mostly think of Vonage as a consumer VoIP player, but in recent years, the company also launched its Vonage Business Cloud (VBC) platform and acquired Nexmo, an API-based communications service that competes directly with many of Twilio’s core services. Today, Vonage is bringing its VBC service and Nexmo a bit closer with the launch of number programmability for its business customers.

What this means is that enterprises can now take any VBC number and extend it with the help of Nexmo’s APIs. To enable this, all they have to do is toggle a switch in their management console and then they’ll be able to programmatically route calls, create custom communications apps and workflows, and integrate third-party systems to build chatbots and other tools.

“About four years ago we made a pretty strong pivot to going from residential — a lot of people know Vonage as a residential player — to the business side,” Vonage senior VP of product management Jay Patel told me. “And through a series of acquisitions [including Nexmo], we’ve kind of built what we think is a very unique offering.” In many ways, those different platforms were always separated from each other, though. With all of the pieces in place now, however, the team started thinking about how it could use the Nexmo APIs to allow its customers in the unified communications and contact center space to more easily customize these services for themselves.

About a year ago, the team started working on this new functionality that brings the programmability of Nexmo to VBC. “We realized it doesn’t make sense for us to create our own new sets of APIs on our unified communications and contact center space,” said Patel. “Why don’t we use the APIs that Nexmo has already built?”

As Patel also stressed, the phone number is still very much linked to a business or individual employee — and they don’t want to change that just for the sake of having a programmable service. By turning on programmability for these existing numbers, though, and leveraging the existing Nexmo developer ecosystem and the building blocks those users have already created, the company believes that it’s able to offer a differentiated service that allows users to stay on its platform instead of having to forward a call to a third-party service like Twilio, for example, to enable similar capabilities.

In terms of those capabilities, users can pretty much do anything they want with these calls — and that’s important because every company has different processes and requirements. Maybe that’s logging info into multiple CRM systems in parallel or taking a clip of a call and pushing it into a different system for training purposes. Or you could have the system check your calendar when there are incoming calls and then, if it turns out you are in a meeting, offer the caller a callback whenever your calendar says you’re available again. All of that should only take a few lines of code or, if you want to avoid most of the coding, a few clicks in the company’s GUI for building these flows.
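As a rough sketch of what the calendar-aware flow might look like: Nexmo’s Voice API answers inbound calls with a JSON list of NCCO (Nexmo Call Control Object) actions, so a webhook handler could branch on availability. The calendar lookup below is a hypothetical helper standing in for a real integration, and the wording of the flow is ours, not Vonage’s:

```python
# Sketch of an answer-webhook handler for Nexmo's Voice API, which responds
# to an inbound call with a list of NCCO actions ("talk", "input", "connect").
# `calendar_is_busy` is a hypothetical helper, not part of the Nexmo API.

def calendar_is_busy(user_id):
    """Hypothetical lookup; a real version would query the user's calendar service."""
    return True  # pretend the callee is in a meeting

def answer_ncco(user_id):
    """Build the NCCO response for an incoming call to this user's number."""
    if calendar_is_busy(user_id):
        # Offer the caller a callback instead of ringing through.
        return [
            {"action": "talk",
             "text": "I'm in a meeting right now. Press 1 to request a callback."},
            {"action": "input", "maxDigits": 1},
        ]
    # Otherwise connect the call to the user's phone as usual.
    return [
        {"action": "connect",
         "endpoint": [{"type": "phone", "number": "14155550100"}]},
    ]

print(answer_ncco("alice"))
```

The point Patel makes is that this kind of branching used to require forwarding the call out to a third-party platform; with number programmability it runs against the existing VBC number.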

Vonage believes that these new capabilities will attract quite a few new customers. “It’s our value-add when we’re selling to new customers,” Patel said. “They’re looking for this kind of capability or are running into brick walls. We see a lot of companies that have an idea but they don’t know how to do it. They’re not engineers or they don’t have a big staff of developers, but because of the way we’ve implemented this, it brings the barrier of entry to create these solutions much lower than if you had a legacy system on-prem where you had to be a C++ developer to build an app.”

Mar
18
2019
--

Atlassian acquires AgileCraft for $166M

Atlassian today announced that it has acquired AgileCraft, a service that aims to help enterprises plan their strategic projects and workstreams. The service gives business leaders additional insight into the current status of technical projects, as well as into the bottlenecks, risks and dependencies of those projects. Indeed, the focus of AgileCraft is less on technical teams than on the business teams that support them and help them manage the digital transformation of their businesses.

The total price of the acquisition is about $166 million, with $154 million in cash and the remainder in restricted shares.

“Many leaders are still making mission-critical decisions using their instincts and best guesses instead of data,” said Scott Farquhar, Atlassian’s co-founder and co-CEO, in today’s announcement. “As Atlassian tools spread through organizations, technology leaders need better visibility into work performed by their teams. With AgileCraft joining Atlassian, we believe we’re the best company to help executives align the work across their organization – providing an all-encompassing view that connects strategy, work, and outcomes.”

As the name implies, AgileCraft focuses on the Agile methodology, though it also offers a bit of flexibility there with support for frameworks like SAFe, LeSS and the Spotify model. It supports pulling in data from tools like Atlassian’s Jira, but also Microsoft’s Team Foundation Server, IBM’s RTC and other services.

Atlassian will continue to operate AgileCraft, which had raised about $10.1 million before the acquisition, as a standalone service. “We will continue to focus relentlessly on our customers’ success,” writes AgileCraft’s founder and CEO Steve Elliott. “We remain dedicated to pioneering enterprise agility and are thrilled to team up with the outstanding people at Atlassian to help our customers thrive.”

Over the years, Atlassian started embracing users and use cases that go beyond its core developer tools; Jira and Confluence are the prime examples of this. Today’s acquisition continues this trend in that AgileCraft aims to bring many of the methodologies that tech teams use to the rest of the company.

“One of the critical roles we play for lots of organizations is in helping drive this kind of digital transformation where we’re really empowering the teams that are building and developing the kind of technology that moves our customers forward,” Atlassian president Jay Simons told me. “AgileCraft basically complements all of that by extending visibility into what teams are using Atlassian products to do up into key stakeholders and leaders in the business that are trying to manage better visibility at a portfolio or program level.”

Simons also stressed that AgileCraft already has very strong integrations into the existing Atlassian tools — and indeed, that was one of the main drivers of the acquisition. He noted that the company plans to improve those and think about additional patterns. “We’ll continue doing what we’re doing,” he said.

Simons also noted that he expects that a lot of Jira customers will now look at AgileCraft as an additional tool in helping the businesses manage their business’s digital transformation.

Atlassian doesn’t typically make a lot of acquisitions. Its pace is about one major buy per year. Last year, the company picked up OpsGenie for $295 million. In 2017, it acquired Trello for $425 million, the company’s biggest acquisition to date. Other major products the company acquired include StatusPage, BlueJimp, HipChat and Bitbucket (all the way back in 2010).

Mar
14
2019
--

ProdPerfect gets $2.6 million to automate QA testing for web apps

ProdPerfect, a Boston-based startup focused on automating QA testing for web apps, has announced the close of a $2.6 million Seed round co-led by Eniac Ventures and Fika Ventures, with participation from Entrepreneurs Roundtable Accelerator.

ProdPerfect started when co-founder and CEO Dan Widing was VP of engineering at WeSpire, where he saw firsthand the pain points associated with web application QA testing. Whereas there were all kinds of product analytics tools for product engineers, the same data wasn’t there for the engineers building QA tests that are meant to replicate user behavior.

He imagined a platform that would use live data around real user behavior to formulate these QA tests. That’s how ProdPerfect was born. The platform observes user behavior, then builds and delivers test scripts to the engineering team.

The service continues to build on what it knows about a product, and can generate new tests when new features are added, based on aggregated flows of common user behavior. The platform doesn’t track any identifying information about users; it anonymizes them and watches how they move through the web app. The hope is that ProdPerfect gives engineers the opportunity to keep building the product instead of spreading their resources across building a QA testing suite.

The new funding will go toward expanding the sales team and further building out the product. For now, ProdPerfect simply offers functional testing, which uses a single virtual user to test whether a product breaks or not. But president and co-founder Erik Fogg sees an opportunity to build more integrated testing, including performance, security and localization testing.

Fogg says the company is growing 40 percent month over month in booked revenue.

The company says it can deploy within two weeks of installing a data tracker, and provide more than 70 percent coverage of all user interactions with 95 percent+ test stability.

“The greatest challenge is going to be finding people who share our company’s core values and are of high enough talent, ambition and autonomy in part because our hiring road map is so steep,” said Fogg. “Growing pains catch up with businesses as a team expands quickly and we have to make sure that we’re picky and that we reinforce the values we have.”

Mar
08
2019
--

Okta to acquire workflow automation startup Azuqua for $52.5M

During its earnings report yesterday afternoon, Okta announced it intends to acquire Azuqua, a Seattle, Wash.-based workflow automation startup, for $52.5 million.

In a blog post announcing the news, Okta co-founder and COO Frederic Kerrest described the combination of the two companies as a way to let users move smoothly between applications in a complex workflow without having to constantly present their credentials.

“With Okta and Azuqua, IT teams will be able to use pre-built connectors and logic to create streamlined identity processes and increase operational speed. And, product teams will be able to embed this technology in their own applications alongside Okta’s core authentication and user management technology to build…integrated customer experiences,” Kerrest wrote.

In a modern enterprise, people and work are constantly shifting and moving between applications and services and combining automation software with identity and access management could offer a seamless way to move between them.

This represents Okta’s largest acquisition to-date and follows Stormpath almost exactly two years ago and ScaleFT last July. Taken together, you can see a company that is trying to become a more comprehensive identity platform.

Azuqua, which has raised $16 million since it launched in 2013, appears to have given investors a pretty decent return. When the deal closes, Okta intends to move the Azuqua team to its Bellevue offices, increasing its presence in the Northwest. Okta’s headquarters are in San Francisco. Azuqua customers include Airbnb, McDonald’s, VMware and HubSpot.

Okta was founded in 2009 and raised over $229 million before going public in April 2017.

Mar
04
2019
--

Can predictive analytics be made safe for humans?

Massive-scale predictive analytics is a relatively new phenomenon, one that challenges both decades of law as well as consumer thinking about privacy.

As a technology, it may well save thousands of lives in applications like predictive medicine, but if it isn’t used carefully, it may prevent thousands from getting loans, for instance, if an underwriting algorithm is biased against certain users.

I chatted with Dennis Hirsch a few weeks ago about the challenges posed by this new data economy. Hirsch is a professor of law at Ohio State and head of its Program on Data and Governance. He’s also affiliated with the university’s Risk Institute.

“Data ethics is the new form of risk mitigation for the algorithmic economy,” he said. In a post-Cambridge Analytica world, every company has to assess what data it has on its customers and mitigate the risk of harm. How to do that, though, is at the cutting edge of the new field of data governance, which investigates the processes and policies through which organizations manage their data.

You’re reading the Extra Crunch Daily. Like this newsletter? Subscribe for free to follow all of our discussions and debates.

“Traditional privacy regulation asks whether you gave someone notice and gave them a choice,” he explains. That principle is the bedrock for Europe’s GDPR law, and for the patchwork of laws in the U.S. that protect privacy. It’s based around the simplistic idea that a datum — such as a customer’s address — shouldn’t be shared with, say, a marketer without that user’s knowledge. Privacy is about protecting the address book, so to speak.

The rise of “predictive analytics” though has completely demolished such privacy legislation. Predictive analytics is a fuzzy term, but essentially means interpreting raw data and drawing new conclusions through inference. This is the story of the famous Target data crisis, where the retailer recommended pregnancy-related goods to women who had certain patterns of purchases. As Charles Duhigg explained at the time:

Many shoppers purchase soap and cotton balls, but when someone suddenly starts buying lots of scent-free soap and extra-big bags of cotton balls, in addition to hand sanitizers and washcloths, it signals they could be getting close to their delivery date.

Predictive analytics is difficult to predict. Hirsch says “I don’t think any of us are going to be intelligent enough to understand predictive analytics.” Talking about customers, he said “They give up their surface items — like cotton balls and unscented body lotion — they know they are sharing that, but they don’t know they are giving up their pregnancy status. … People are not going to know how to protect themselves because they can’t know what can be inferred from their surface data.”
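The Target-style inference can be made concrete with a toy scoring model: no single purchase is sensitive, but a weighted combination predicts a sensitive status that the shopper never disclosed. The items and weights below are invented for illustration:

```python
# Toy illustration of inference from "surface" data: no individual item is
# sensitive, but a weighted combination yields a sensitive prediction.
# Items, weights and the threshold are invented for this sketch.
WEIGHTS = {
    "unscented_lotion": 0.3,
    "cotton_balls_xl": 0.2,
    "hand_sanitizer": 0.2,
    "washcloths": 0.2,
}
THRESHOLD = 0.7

def pregnancy_score(basket):
    """Sum the weights of purchased items; higher means a stronger inference."""
    return sum(WEIGHTS.get(item, 0.0) for item in basket)

basket = ["unscented_lotion", "cotton_balls_xl", "hand_sanitizer", "washcloths"]
print(pregnancy_score(basket) >= THRESHOLD)  # True: the combination triggers the inference
```

This is exactly why notice-and-consent fails here: the shopper consented to sharing each item on the left side of the equation, never to the prediction on the right.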

In other words, the scale of those predictions completely undermines notice and consent.

Even though the law hasn’t caught up to this exponentially more challenging problem, companies themselves seem to be responding in the wake of Target and Facebook’s very public scandals. “What we are hearing is that we don’t want to put our customers at risk,” Hirsch explained. “They understand that this predictive technology gives them really awesome power and they can do a lot of good with it, but they can also hurt people with it.” The key actors here are corporate chief privacy officers, a role that has cropped up in recent years to mitigate some of these challenges.

Hirsch is spending significant time trying to build new governance strategies to allow companies to use predictive analytics in an ethical way, so that “we can achieve and enjoy its benefits without having to bear these costs from it.” He’s focused on four areas: privacy, manipulation, bias and procedural unfairness. “We are going to set out principles on what is ethical and what is not,” he said.

Much of that focus has been on how to help regulators build policies that can manage predictive analytics. Since people can’t understand the extent that inferences can be made with their data, “I think a much better regulatory approach is to have someone who does understand, ideally some sort of regulator, who can draw some lines.” Hirsch has been researching how the FTC’s Unfairness Authority may be a path forward for getting such policies into practice.

He analogized this to the Food and Drug Administration. “We have no ability to assess the risks of a given drug [so] we give it to an expert agency and allow them to assess it,” he said. “That’s the kind of regulation that we need.”

Hirsch overall has a balanced perspective on the risks and rewards here. He wants analytics to be “more socially acceptable” but, at the same time, sees the need for careful scrutiny and oversight to ensure that consumers are protected. Ultimately, he sees that as incredibly beneficial to companies, which can take the value out of this tech without risking consumer ire.

Who will steal your data more: China or America?


Talking about data ethics, Europe is in the middle of a superpower pincer. China’s telecom giant Huawei has made expansion on the continent a major priority, while the United States has been sending delegation after delegation to convince its Western allies to reject Chinese equipment. The dilemma was quite visible last week at MWC-Barcelona, where the two sides each tried to make their case.

It’s been years since the Snowden revelations showed that the United States was operating an enormous eavesdropping infrastructure targeting countries throughout the world, including across Europe. Huawei has reiterated its stance that it does not steal information from its equipment, and has repeated its demands that the Trump administration provide public proof of flaws in its security.

There is an abundance of moral relativism here, but I see this as increasingly a litmus test of the West on China. China has not hidden its ambitions to take a prime role in East Asia, nor has it hidden its intentions to build a massive surveillance network over its own people or to influence the media overseas.

Those tactics, though, are straight out of the American playbook, which lost its moral legitimacy over the past two decades from some combination of the Iraq War, Snowden, WikiLeaks and other public scandals that have undermined trust in the country overseas.

Security and privacy might have been a competitive advantage for American products over their Chinese counterparts, but that advantage has been weakened for many countries to near zero. We are increasingly going to see countries choose a mix of Chinese and American equipment in sensitive applications, if only to ensure that if one country is going to steal their data, it might as well be balanced.

Obsessions

  • Perhaps some more challenges around data usage and algorithmic accountability
  • We have a bit of a theme around emerging markets, macroeconomics, and the next set of users to join the internet.
  • More discussion of megaprojects, infrastructure, and “why can’t we build things”

Thanks

To every member of Extra Crunch: thank you. You allow us to get off the ad-laden media churn conveyor belt and spend quality time on amazing ideas, people, and companies. If I can ever be of assistance, hit reply, or send an email to danny@techcrunch.com.

This newsletter is written with the assistance of Arman Tabatabai from New York.


Feb
21
2019
--

JFrog acquires Shippable, adding continuous integration and delivery to its DevOps platform

JFrog, the popular DevOps startup now valued at more than $1 billion after raising $165 million last October, is making a move to expand the tools and services it provides to developers on its software operations platform: it has acquired Shippable, a cloud-based continuous integration and delivery platform (CI/CD) that developers use to ship code and deliver app and microservices updates, and plans to integrate it into its Enterprise+ platform.

Terms of the deal — JFrog’s fifth acquisition — are not being disclosed, said Shlomi Ben Haim, JFrog’s co-founder and CEO, in an interview. From what I understand, though, it was in the ballpark of Shippable’s most recent valuation, which was $42.6 million back in 2014 when it raised $8 million, according to PitchBook data.  (And that was the last time it raised money.)

Shippable employees are joining JFrog and plan to release the first integrations with Enterprise+ this coming summer, and a full integration by Q3 of this year.

Shippable, founded in 2013, made its name early on as a provider of a containerized continuous integration and delivery platform based on Docker containers, but as Kubernetes has overtaken Docker in containerized deployments, the startup had also shifted its focus beyond Docker containers.

The acquisition speaks to the consolidation that is afoot in the world of DevOps, where developers and organizations are looking for more end-to-end toolkits, not just to help develop, update and run their apps and microservices, but to provide security and more — or at least, makers of DevOps tools hope they will be, as they themselves look to grow their margins and business.

As more organizations run ever more of their operations as apps and microservices, DevOps tools have risen in prominence, offered both by standalone businesses and by the infrastructure providers whose platforms those tools touch. That means a company like JFrog has an expanding pool of competitors that includes not just the likes of Docker, Sonatype and GitLab, but also AWS, Google Cloud Platform and Azure and “the Red Hats of the world,” in the words of Ben Haim.

For Shippable customers, the integration will give them access to security, binary management and other enterprise development tools.

“We’re thrilled to join the JFrog family and further the vision around Liquid Software,” said Avi Cavale, founder and CEO of Shippable, in a statement. “Shippable users and customers have long enjoyed our next-generation technology, but now will have access to leading security, binary management and other high-powered enterprise tools in the end-to-end JFrog Platform. This is truly exciting, as the combined forces of JFrog and Shippable can make full DevOps automation from code to production a reality.”

On the part of JFrog, the company will be using Shippable to provide a native CI/CD tool directly within JFrog.

“Before, most of our users would use Jenkins, CircleCI and other CI/CD automation tools,” Ben Haim said. “But what you are starting to see in the wider market is a gradual consolidation of CI tools into the code repository.”

He emphasized that this will not mean any changes for developers who are already happy using Jenkins or other integrations: just that it will now be offering a native solution that will be offered alongside these (presumably both with easier functionality and with competitive pricing).

JFrog today has 5,000 paying customers, up from 4,500 in October, including “most of the Fortune 500,” with marquee customers including the likes of Apple and Adobe, but also banks, healthcare organizations and insurance companies — “conservative businesses,” said Ben Haim, that are also now realizing the importance of using DevOps.

Feb
21
2019
--

Redis Labs changes its open-source license — again

Redis Labs, fresh off its latest funding round, today announced a change to how it licenses its Redis Modules. This may not sound like a big deal, but in the world of open-source projects, licensing is currently a big issue. That’s because organizations like Redis, MongoDB, Confluent and others have recently introduced new licenses that make it harder for their competitors to take their products and sell them as rebranded services without contributing back to the community (and most of these companies point directly at AWS as the main offender here).

“Some cloud providers have repeatedly taken advantage of successful opensource projects, without significant contributions to their communities,” the Redis Labs team writes today. “They repackage software that was not developed by them into competitive, proprietary service offerings and use their business leverage to reap substantial revenues from these open source projects.”

The point of these new licenses is to put a stop to this.

This is not the first time Redis Labs has changed how it licenses its Redis Modules (and I’m stressing the “Redis Modules” part here because this is only about modules from Redis Labs and does not have any bearing on how the Redis database project itself is licensed). Back in 2018, Redis Labs changed its license from AGPL to Apache 2 modified with Commons Clause. The “Commons Clause” is the part that places commercial restrictions on top of the license.

That created quite a stir, as Redis Labs co-founder and CEO Ofer Bengal told me a few days ago when we spoke about the company’s funding.

“When we came out with this new license, there were many different views,” he acknowledged. “Some people condemned that. But after the initial noise calmed down — and especially after some other companies came out with a similar concept — the community now understands that the original concept of open source has to be fixed because it isn’t suitable anymore to the modern era where cloud companies use their monopoly power to adopt any successful open source project without contributing anything to it.”

The way the code was licensed, though, created a bit of confusion, the company now says, because some users thought they were only bound by the terms of the Apache 2 license. Some terms in the Commons Clause, too, weren’t quite clear (including the meaning of “substantial,” for example).

So today, Redis Labs is introducing the Redis Source Available License. This license, too, only applies to certain Redis Modules created by Redis Labs. Users can still get the code, modify it and integrate it into their applications — but that application can’t be a database product, caching engine, stream processing engine, search engine, indexing engine or ML/DL/AI serving engine.

By definition, an open-source license can’t have limitations. This new license does, so it’s technically not an open-source license. In practice, the company argues, it’s quite similar to other permissive open-source licenses, though, and shouldn’t really affect most developers who use the company’s modules (and these modules are RedisSearch, RedisGraph, RedisJSON, RedisML and RedisBloom).

This is surely not the last we’ve heard of this. Sooner or later, more projects will follow the same path. By then, we’ll likely see more standard licenses that address this issue so other companies won’t have to change multiple times. Ideally, though, we won’t need it because everybody will play nice — but since we’re not living in a utopia, that’s not likely to happen.

Feb
20
2019
--

Google’s hybrid cloud platform is now in beta

Last July, at its Cloud Next conference, Google announced the Cloud Services Platform, its first real foray into bringing its own cloud services into the enterprise data center as a managed service. Today, the Cloud Services Platform (CSP) is launching into beta.

It’s important to note that the CSP isn’t — at least for the time being — Google’s way of bringing all of its cloud-based developer services to the on-premises data center. In other words, this is a very different project from something like Microsoft’s Azure Stack. Instead, the focus is on the Google Kubernetes Engine, which allows enterprises to run their applications both in their own data centers and on virtually any cloud platform that supports containers.

As Google Cloud engineering director Chen Goldberg told me, the idea here is to help enterprises innovate and modernize. “Clearly, everybody is very excited about cloud computing, on-demand compute and managed services, but customers have recognized that the move is not that easy,” she said, noting that the vast majority of enterprises are adopting a hybrid approach. And while containers are obviously still a very new technology, she feels good about this bet because most enterprises are already adopting containers and Kubernetes — and they are doing so at exactly the same time as they are adopting cloud, and especially hybrid clouds.

It’s important to note that CSP is a managed platform. Google handles all of the heavy lifting like upgrades and security patches. And for enterprises that need an easy way to install some of the most popular applications, the platform also supports Kubernetes applications from the GCP Marketplace.

As for the tech itself, Goldberg stressed that this isn’t just about Kubernetes. The service also uses Istio, for example, the increasingly popular service mesh that makes it easier for enterprises to secure and control the flow of traffic and API calls between its applications.

With today’s release, Google is also launching its new CSP Config Management tool to help users create multi-cluster policies and set up and enforce access controls, resource quotas and more. CSP also integrates with Google’s Stackdriver Monitoring service and continuous delivery platforms.
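The "resource quotas" a tool like CSP Config Management syncs across clusters are standard Kubernetes objects. As a sketch, here is what one such policy looks like, built as a plain Python dict; the namespace name and limits are made up for illustration, and in practice the manifest would be written as YAML and distributed by the config-management tool.

```python
# A standard Kubernetes ResourceQuota manifest, the kind of per-cluster
# policy a multi-cluster config tool would create and enforce everywhere.

def make_resource_quota(namespace, cpu_limit, mem_limit):
    """Build a ResourceQuota capping total CPU and memory in a namespace."""
    return {
        "apiVersion": "v1",
        "kind": "ResourceQuota",
        "metadata": {"name": "team-quota", "namespace": namespace},
        "spec": {
            "hard": {
                "limits.cpu": cpu_limit,
                "limits.memory": mem_limit,
            }
        },
    }

# Hypothetical example: cap the "payments" namespace at 8 CPUs and 16 GiB.
quota = make_resource_quota("payments", "8", "16Gi")
```

The value of a config-management layer is that this one definition is applied identically to every cluster, on-premises or in the cloud, instead of being hand-edited per environment.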

“On-prem is not easy,” Goldberg said, and given that this is the first time the company is really supporting software in a data center that is not its own, that’s probably an understatement. But Google also decided that it didn’t want to force users into a specific set of hardware specifications like Azure Stack does, for example. Instead, CSP sits on top of VMware’s vSphere server virtualization platform, which most enterprises already use in their data centers anyway. That surely simplifies things, given that this is a very well-understood platform.

Feb
20
2019
--

Why Daimler moved its big data platform to the cloud

Like virtually every big enterprise company, a few years ago, the German auto giant Daimler decided to invest in its own on-premises data centers. And while those aren’t going away anytime soon, the company today announced that it has successfully moved its on-premises big data platform to Microsoft’s Azure cloud. This new platform, which the company calls eXtollo, is Daimler’s first major service to run outside of its own data centers, though it’ll probably not be the last.

As Daimler’s head of its corporate center of excellence for advanced analytics and big data Guido Vetter told me, the company started getting interested in big data about five years ago. “We invested in technology — the classical way, on-premise — and got a couple of people on it. And we were investigating what we could do with data because data is transforming our whole business as well,” he said.

By 2016, the size of the organization had grown to the point where a more formal structure was needed to enable the company to handle its data at a global scale. At the time, the buzz phrase was “data lakes” and the company started building its own in order to build out its analytics capacities.

Electric lineup, Daimler AG

“Sooner or later, we hit the limits as it’s not our core business to run these big environments,” Vetter said. “Flexibility and scalability are what you need for AI and advanced analytics and our whole operations are not set up for that. Our backend operations are set up for keeping a plant running and keeping everything safe and secure.” But in this new world of enterprise IT, companies need to be able to be flexible and experiment — and, if necessary, throw out failed experiments quickly.

So about a year and a half ago, Vetter’s team started the eXtollo project to bring all the company’s activities around advanced analytics, big data and artificial intelligence into the Azure Cloud, and just over two weeks ago, the team shut down its last on-premises servers after slowly turning on its solutions in Microsoft’s data centers in Europe, the U.S. and Asia. All in all, the actual transition between the on-premises data centers and the Azure cloud took about nine months. That may not seem fast, but for an enterprise project like this, that’s about as fast as it gets (and for a while, it fed all new data into both its on-premises data lake and Azure).

If you work for a startup, then all of this probably doesn’t seem like a big deal, but for a more traditional enterprise like Daimler, even just giving up control over the physical hardware where your data resides was a major culture change and something that took quite a bit of convincing. In the end, the solution came down to encryption.

“We needed the means to secure the data in the Microsoft data center with our own means that ensure that only we have access to the raw data and work with the data,” explained Vetter. In the end, the company decided to use the Azure Key Vault to manage and rotate its encryption keys. Indeed, Vetter noted that knowing that the company had full control over its own data was what allowed this project to move forward.
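The pattern behind a key-management service like Azure Key Vault is envelope encryption: each record is encrypted with its own data key, and only the wrapped data keys touch the master key, so rotating the master key means re-wrapping small keys rather than re-encrypting bulk data. The sketch below is a deliberately simplified toy (XOR stands in for real ciphers; do not use it for actual secrecy), and the function names are mine, not Key Vault's API.

```python
# Toy envelope encryption: data keys encrypt records, the master key only
# wraps data keys, so master-key rotation never touches the bulk data.
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    """Stand-in 'cipher': XOR data against a repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def encrypt_record(plaintext: bytes, master_key: bytes) -> dict:
    data_key = secrets.token_bytes(32)          # fresh key per record
    return {
        "ciphertext": xor(plaintext, data_key),
        "wrapped_key": xor(data_key, master_key),
    }

def decrypt_record(record: dict, master_key: bytes) -> bytes:
    data_key = xor(record["wrapped_key"], master_key)
    return xor(record["ciphertext"], data_key)

def rotate_master_key(record: dict, old_master: bytes, new_master: bytes) -> dict:
    data_key = xor(record["wrapped_key"], old_master)
    record["wrapped_key"] = xor(data_key, new_master)  # bulk data untouched
    return record
```

Because only the customer holds the master key, the cloud provider can store the ciphertext and wrapped keys without ever being able to read the raw data, which is the control Vetter describes.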

Vetter tells me the company obviously looked at Microsoft’s competitors as well, but he noted that his team didn’t find a compelling offer from other vendors in terms of functionality and the security features that it needed.

Today, Daimler’s big data unit uses tools like HDInsight and Azure Databricks, which cover more than 90 percent of the company’s current use cases. In the future, Vetter also wants to make it easier for less experienced users to use self-service tools to launch AI and analytics services.

While cost is often a factor that counts against the cloud, because renting server capacity isn’t cheap, Vetter argues that this move will actually save the company money and that storage costs, especially, are going to be cheaper in the cloud than in its on-premises data center (and chances are that Daimler, given its size and prestige as a customer, isn’t exactly paying the same rack rate that others are paying for the Azure services).

As with so many big data AI projects, predictions are the focus of much of what Daimler is doing. That may mean looking at a car’s data and error code and helping the technician diagnose an issue or doing predictive maintenance on a commercial vehicle. Interestingly, the company isn’t currently bringing to the cloud any of its own IoT data from its plants. That’s all managed in the company’s on-premises data centers because it wants to avoid the risk of having to shut down a plant because its tools lost the connection to a data center, for example.
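In its simplest form, the error-code analysis described above amounts to spotting vehicles whose fault counts deviate sharply from the fleet. The sketch below flags such outliers with a z-score; the vehicle names, counts and threshold are invented for illustration and are not Daimler's actual method.

```python
# Toy predictive-maintenance signal: flag vehicles whose error-code counts
# sit far above the fleet average, measured in standard deviations.
from statistics import mean, pstdev

def flag_outliers(counts_by_vehicle: dict, z_threshold: float = 2.0) -> list:
    """Return vehicles whose count exceeds the fleet mean by > z_threshold sigmas."""
    values = list(counts_by_vehicle.values())
    mu, sigma = mean(values), pstdev(values)
    if sigma == 0:          # identical counts: nothing stands out
        return []
    return [
        vehicle
        for vehicle, count in counts_by_vehicle.items()
        if (count - mu) / sigma > z_threshold
    ]
```

A real system would look at which codes fire, not just how often, but even this simple signal shows why pooling fleet data in one place is a prerequisite for the predictions Daimler is after.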
