Mar 30, 2020

Turbo Systems hires former Looker CMO Jen Grant as CEO

Turbo Systems, a three-year-old no-code mobile app startup, announced today that it has brought on industry veteran Jen Grant as CEO.

Grant, who was previously vice president of marketing at Box and chief marketing officer at Elastic and Looker, brings more than 15 years of tech company experience to the young startup.

She says that when Looker got acquired by Google last June for $2.6 billion, she began looking for her next opportunity. She had done a stint with Google as a product manager earlier in her career and was looking for something new.

She saw Looker as a model for the kind of company she wanted to join: one with a founder focused on product and engineering who, as Looker had done, hired an outside CEO early on to run the business. She found that in Turbo, where founder Hari Subramanian was taking on that type of role. Subramanian was also a successful entrepreneur, having previously founded ServiceMax before selling it to GE in 2016.

“The first thing that really drew me to Turbo was this partnership with Hari,” Grant told TechCrunch. While that relationship was a key component for her, she says that before she decided to join, she also spoke to customers and saw an enthusiasm there that drew her to the company.

“I love products that actually help people. And so Box is helping people collaborate and share files and work together. Looker is about getting data to everyone in the organization so that everyone could be making great decisions, and at Turbo we’re making it easy for anyone to create a mobile app that helps run their business,” she said.

Grant has been on the job for just 30 days, joining the company in the middle of a global pandemic. So it’s even more challenging than the typical early days for any new CEO, but she is looking ahead and trying to help her 36 employees navigate the situation.

“You know, I didn’t know that this is what would happen in my first 30 days, but what inspires me, what’s a big part of it is that I can help by growing this company, by being successful and by being able to hire more and more people, and contribute to getting our economy back on track,” Grant said.

She also recognizes that there is a lack of diversity in her new CEO role, and she hopes to be a role model. “I have been fortunate to get to a position where I know I can do this job and do it well. And it’s my responsibility to do this work, my responsibility to show it can be done and shouldn’t be an anomaly.”

Turbo Systems was founded in 2017 and has raised $8 million, according to Crunchbase. It helps companies build mobile apps without coding, connecting to 140 different data sources such as Salesforce, SAP and Oracle.

Mar 26, 2020

Tech giants should let startups defer cloud payments

Google, Amazon and Microsoft are the landlords. Amidst the coronavirus economic crisis, startups need a break from paying rent. They’re in a cash crunch. Revenue has stopped flowing in, capital markets like venture debt are hesitant, and startups and small-to-medium-sized businesses are at risk of having to lay off huge numbers of employees, shut down, or both.

Meanwhile, the tech giants are cash rich. Their success this decade means they’re able to weather the storm for a few months. Their customers cannot.

Cloud infrastructure costs are among many startups’ top expenses besides payroll. The option to pay these cloud bills later could save some from going out of business or axing huge parts of their staff. Both would hurt the tech industry, the economy and the individuals laid off. But most worryingly for the giants, it could destroy their customer base.

The mass layoffs have already begun. Soon we’re sure to start hearing about sizable companies shutting down, upended by COVID-19. But there’s still an opportunity to stop a larger bloodbath from ensuing.

That’s why I have a proposal: cloud relief.

The platform giants should let startups and small businesses defer their cloud infrastructure payments for three to six months until they can pay them back in installments. Amazon AWS, Google Cloud, Microsoft Azure, these companies’ additional infrastructure products, and other platform providers should let customers pause payment until the worst of the first wave of the COVID-19 economic disruption passes. Profitable SaaS providers like Salesforce could give customers an extension too.

There are plenty of altruistic reasons to do this. They have the resources to help businesses in need. We all need to support each other in these tough times. This could protect tons of families. Some of these startups are providing important services to the public and even discounting them, thereby ramping up their bills while decreasing revenue.

Then there are the PR reasons. After years of techlash and anti-trust scrutiny, here’s the chance for the giants to prove their size can be beneficial to the world. Recruiters could use it as a talking point. “We’re the company that helped save Silicon Valley.” There’s an explanation for them squirreling away so much cash: the rainy day has finally arrived.

But the capitalistic truth and the story they could sell to Wall Street is that it’s not good for our business if our customers go out of business. Look at what happened to infrastructure providers in the dot-com crash. When tons of startups vaporized, so did the profits for those selling them hosting and tools. Any government stimulus for businesses would be better spent by them paying employees than paying the cloud companies that aren’t in danger. Saving one future Netflix from shutting down could cover any short-term loss from helping 100 other businesses.

This isn’t a handout. These startups will still owe the money. They’d just be able to pay it a little later, spread out over their monthly bills for a year or so. Once mass shelter-in-place orders subside, businesses can operate at least a little closer to normal, investors can get less cautious and customers will have the cash they need to pay their dues. Plus interest, if necessary.

Meanwhile, they’ll be locked in and loyal customers for the foreseeable future. Cloud vendors could gate the deferment to only customers that have been with them for X amount of months or that have already spent Y amount on the platform. The vendors also could offer the deferment on the condition that customers add a year or more to their existing contracts. Founders will remember who gave them the benefit of the doubt.


Consider it a marketing expense. Platforms often offer discounts or free trials to new customers. Now it’s existing customers that need a reprieve. Instead of airport ads, the giants could spend the money ensuring they’ll still have plenty of developers building atop them by the end of 2020.

Beyond deferred payment, platforms could just push the due date on all outstanding bills to three or six months from now. Alternatively, they could offer a deep discount such as 50% off for three months if they didn’t want to deal with accruing debt and then servicing it. Customers with multi-year contracts could be offered the opportunity to downgrade or renegotiate their contracts without penalties. Any of these might require giving sales quota forgiveness to their account executives.

It would likely be far too complicated and risky to accept equity in lieu of cash, a cut of revenue going forward or to provide loans or credit lines to customers. The clearest and simplest solution is to let startups skip a few payments, then pay more every month later until they clear their debt. When asked for comment or about whether they’re considering payment deferment options, Microsoft declined, and Amazon and Google did not respond.

To be clear, administering payment deferment won’t be simple or free. There are sure to be holes that cloud economists can poke in this proposal, but my goal is to get the conversation started. It could require the giants to change their earnings guidance. Rewriting deals with significantly sized customers will take work on both ends, and there’s a chance of breach of contract disputes. Giants would face the threat of customers recklessly using cloud resources before shutting down or skipping town.

Most taxing would be determining and enforcing the criteria of who’s eligible. The vendors would need to lay out which customers are too big so they don’t accidentally give a cloud-intensive but healthy media company a deferment they don’t need. Businesses that get questionably excluded could make a stink in public. Executing on the plan will require staff when giants are stretched thin trying to handle logistics disruptions, misinformation and accelerating work-from-home usage.

Still, this is the moment when the fortunate need to lend a hand to the vulnerable. Not a hand out, but a hand up. Companies with billions in cash in their coffers could save those struggling to pay salaries. All the fundraisers and info centers and hackathons are great, but this is how the tech giants can live up to their lofty mission statements.

We all live in the cloud now. Don’t evict us. #CloudRelief

Thanks to Falon Fatemi, Corey Quinn, Ilya Fushman, Jason Kim, Ilya Sukhar and Michael Campbell for their ideas and feedback on this proposal.

Mar 24, 2020

GitLab offers key lessons in running an all-remote workforce in new e-book

As companies that are used to having workers in the same building struggle to find ways to work from home, one company that has been remote from day one is GitLab. It recently published a handbook to help other companies that are facing the work-from-home challenge for the first time.

Lest you think GitLab is a small organization, it’s not. It’s 1,200 employees strong, all of whom work from home across a mind-boggling 67 countries. And it’s doing well. In September, the company raised $268 million on a $2.75 billion valuation.

Given that it has found a way to make a decentralized company work, GitLab has decided to share the best practices they’ve built up over the years to help others just starting on this journey.

Among the key bits of advice in the 34-page report, perhaps the most important to note when you begin working apart is to document everything. GitLab has a reputation for hyper transparency, publishing everything from its 3-year business strategy to its projected IPO date for the world to see.

But it’s also about writing down policies and procedures and making them available to the remote workforce. When you’re not in the same building, you can’t simply walk up to someone’s cubicle and ask a question, so you need to be vigilant about documenting your processes in a handbook that is available online and searchable.

“By adopting a handbook-first approach, team members have ‘a single source of truth’ for answers. Even though documentation takes a little more time upfront, it prevents people from having to ask the same question repeatedly. Remote work is what led to the development of GitLab’s publicly viewable handbook,” the company wrote in the e-book.

That includes an on-boarding procedure because folks aren’t coming into a meeting with HR when they start at GitLab. It’s essential to have all the information new hires need in one place, and the company has worked hard to build on-boarding templates. They also offer remote GitLab 101 meetings to orient folks who need more face time to get going.

You would think that when you work like this, meetings would be required, but GitLab suggests making meetings optional. That’s because people are spread across the world’s time zones, making it difficult to get everyone together at the same time. Instead, the company records meetings and brainstorms ideas, essentially virtual whiteboarding in Google Docs.

Another key piece of advice is to align your values with a remote way of working. That means changing your management approach to fit the expectations of a remote workforce. “If your values are structured to encourage conventional colocated workplace norms (such as consensus gathering or recurring meetings with in-person teams), rewrite them. If values are inconsistent with the foundation of remote work, there’s bound to be disappointment and confusion. Values can set the right expectations and provide a clear direction for the company going forward,” the company wrote.

This is just scratching the surface of what’s in the handbook, but it’s a valuable resource for anyone who is trying to find a way to function in a remote work environment. Each company will have its own culture and way of dealing with this, of course, but when a company like GitLab, which was born remote, provides this level of advice, it pays to listen and take advantage of their many years of expertise.

Mar 20, 2020

Google cancels I/O developer conference in light of COVID-19 crisis

Google announced on Twitter today that it was cancelling its annual I/O developer conference out of concern for the health and safety of all involved. It will not be holding any online conference in its place either.

“Out of concern for the health and safety of our developers, employees, and local communities — and in line with recent ‘shelter in place’ orders by the local Bay Area counties — we sadly will not be holding I/O in any capacity this year,” the company tweeted.

This is not a small deal, as Google uses this event, along with the Google Cloud Next conference, which it has also canceled, to let developers, customers, partners and other interested parties know what new features, products and services it will be introducing in the coming year.

Without a major venue to announce these new tools, it will be harder for the company to get the word out about them or gain the power of human networking that these conferences provide. All of that is taking a backseat this year over concerns about the virus.

The company made clear that it does not intend to reschedule these events in person or in a virtual capacity at all this year, and will look for other ways to inform the community of changes, updates and new services in the coming months.

“Right now, the most important thing all of us can do is focus our attention on helping people with the new challenges we all face. Please know that we remain committed to finding other ways to share platform updates with you through our developer blogs and community forums,” the company wrote.

Mar 17, 2020

Spectro Cloud launches with $7.5M investment to help developers build Kubernetes clusters their way

By now we know that Kubernetes is a wildly popular container management platform, but if you want to use it, you pretty much have to choose between having someone manage it for you or building it yourself. Spectro Cloud emerged from stealth today with a $7.5 million investment to give you a third choice that falls somewhere in the middle.

The funding was led by Sierra Ventures with participation from Boldstart Ventures.

Ed Sim, founder at Boldstart, says he liked the team and the tech. “Spectro Cloud is solving a massive pain that every large enterprise is struggling with: how to roll your own Kubernetes service on a managed platform without being beholden to any large vendor,” Sim told TechCrunch.

Spectro co-founder and CEO Tenry Fu says an enterprise should not have to compromise between control and ease of use. “We want to be the first company that brings an easy-to-use managed Kubernetes experience to the enterprise, but also gives them the flexibility to define their own Kubernetes infrastructure stacks at scale,” Fu explained.

Fu says that the stack, in this instance, consists of everything from the base operating system to the Kubernetes version to the storage, networking and other layers like security, logging, monitoring and load balancing, or anything else that’s infrastructure-related around Kubernetes.

“Within an organization in the enterprise you can serve the needs of your various groups, down to pretty granular level with respect to what’s in your infrastructure stack, and then you don’t have to worry about lifecycle management,” he explained. That’s because Spectro Cloud handles that for you, while still giving you that control.
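To make that idea concrete, here is a minimal sketch of what such a declarative infrastructure stack might look like. The type and field names below are hypothetical illustrations of the layers Fu describes, not Spectro Cloud’s actual interface.

```typescript
// Hypothetical illustration only -- not Spectro Cloud's actual API.
// Each layer of the infrastructure stack Fu describes is pinned explicitly,
// so a team gets granular control while lifecycle management (upgrades,
// patching) is handled by the platform.
interface LayerChoice {
  name: string;
  version: string;
}

interface ClusterStack {
  os: LayerChoice;            // base operating system
  kubernetesVersion: string;  // Kubernetes release to run
  networking: LayerChoice;    // CNI layer
  storage: LayerChoice;       // CSI layer
  addons: LayerChoice[];      // security, logging, monitoring, load balancing, etc.
}

// One team's stack; another group in the same enterprise could define its own.
const paymentsTeamStack: ClusterStack = {
  os: { name: "ubuntu", version: "18.04" },
  kubernetesVersion: "1.17.3",
  networking: { name: "calico", version: "3.12.0" },
  storage: { name: "aws-ebs-csi-driver", version: "0.5.0" },
  addons: [
    { name: "prometheus", version: "2.16.0" },
    { name: "fluentd", version: "1.9.1" },
  ],
};
```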

That gives enterprise developers greater deployment flexibility and the ability to move between cloud infrastructure providers more easily, something that is top of mind today as companies don’t want to be locked into a single vendor.

“There’s an infrastructure control continuum that forces enterprises into trade-offs against these needs. At one extreme, the managed offerings offer a kind of nirvana around ease of use, but it’s at the expense of control over things like the cloud that you’re on or when you adopt new ecosystem options like updated versions of Kubernetes.”

Fu and his co-founders have a deep background in this, having previously been part of CliQr, a company that helped customers manage applications across hybrid cloud environments. They sold that company to Cisco in 2016 and began developing Spectro Cloud last spring.

It’s early days, but the company has been working with 16 beta customers.

Mar 11, 2020

AWS launches Bottlerocket, a Linux-based OS for container hosting

AWS has launched its own open-source operating system for running containers on both virtual machines and bare metal hosts. Bottlerocket, as the new OS is called, is basically a stripped-down Linux distribution that’s akin to projects like CoreOS’s now-defunct Container Linux and Google’s container-optimized OS. The OS is currently in its developer preview phase, but you can test it as an Amazon Machine Image for EC2 (and by extension, under Amazon EKS, too).
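For those who want to kick the tires, launching the preview image on EC2 might look roughly like the following minimal sketch, assuming the aws-sdk (v2) npm package. The region, instance type and AMI ID are placeholders; the real developer preview AMI ID comes from AWS’s Bottlerocket documentation.

```typescript
// A minimal sketch, assuming the aws-sdk (v2) npm package. The AMI ID below
// is a placeholder; the actual Bottlerocket developer preview AMI ID comes
// from AWS's documentation for your region.
import * as AWS from "aws-sdk";

const ec2 = new AWS.EC2({ region: "us-west-2" });

async function launchBottlerocketPreview(): Promise<void> {
  const result = await ec2
    .runInstances({
      ImageId: "ami-0123456789abcdef0", // placeholder preview AMI
      InstanceType: "t3.small",
      MinCount: 1,
      MaxCount: 1,
    })
    .promise();

  console.log("Launched instance:", result.Instances?.[0]?.InstanceId);
}

launchBottlerocketPreview().catch(console.error);
```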

As AWS chief evangelist Jeff Barr notes in his announcement, Bottlerocket supports Docker images and images that conform to the Open Container Initiative image format, which means it’ll basically run all Linux-based containers you can throw at it.

One feature that makes Bottlerocket stand out is that it does away with a package-based update system. Instead, it uses an image-based model that, as Barr notes, “allows for a rapid & complete rollback if necessary.” The idea here is that this makes updates easier. At the core of this update process is “The Update Framework,” an open-source project hosted by the Cloud Native Computing Foundation.

AWS says it will provide three years of support (after General Availability) for its own builds of Bottlerocket. As of now, the project is very much focused on AWS, of course, but the code is available on GitHub and chances are we will see others expand on AWS’ work.

The company is launching the project in cooperation with a number of partners, including Alcide, Armory, CrowdStrike, Datadog, New Relic, Sysdig, Tigera, Trend Micro and Weaveworks.

“Container-optimized operating systems will give dev teams the additional speed and efficiency to run higher throughput workloads with better security and uptime,” said Michael Gerstenhaber, director of Product Management at Datadog. “We are excited to work with AWS on Bottlerocket, so that as customers take advantage of the increased scale they can continue to monitor these ephemeral environments with confidence.”

 

Mar 4, 2020

Google Cloud announces four new regions as it expands its global footprint

Google Cloud today announced its plans to open four new data center regions. These regions will be in Delhi (India), Doha (Qatar), Melbourne (Australia) and Toronto (Canada) and bring Google Cloud’s total footprint to 26 regions. The company previously announced that it would open regions in Jakarta, Las Vegas, Salt Lake City, Seoul and Warsaw over the course of the next year. The announcement also comes only a few days after Google opened its Salt Lake City data center.

GCP already had a data center presence in India, Australia and Canada before this announcement, but with these newly announced regions, it will now offer two geographically separate regions in each of those countries, enabling in-country disaster recovery, for example.

Google notes that the region in Doha marks the company’s first strategic collaboration agreement to launch a region in the Middle East with the Qatar Free Zones Authority. One of the launch customers there is Bespin Global, a major managed services provider in Asia.

“We work with some of the largest Korean enterprises, helping to drive their digital transformation initiatives. One of the key requirements that we have is that we need to deliver the same quality of service to all of our customers around the globe,” said John Lee, CEO, Bespin Global. “Google Cloud’s continuous investments in expanding their own infrastructure to areas like the Middle East make it possible for us to meet our customers where they are.”

Mar 4, 2020

Netlify nabs $53M Series C as microservices approach to web development grows

Netlify, the startup that wants to kill the web server and change the way developers build websites, announced a $53 million Series C today.

EQT Ventures Fund led the round with contributions from existing investors Andreessen Horowitz and Kleiner Perkins and newcomer Preston-Werner Ventures. Under the terms of the deal, Laura Yao, deal partner and investment advisor at EQT Ventures, will be joining the Netlify board. The startup has now raised $97 million, according to the company.

As with many startups recently, Netlify co-founder Chris Bach says the company wasn’t looking for new funding, but felt that with the business growing rapidly, it would be prudent to take the money to help continue that growth.

While Bach and CEO Matt Biilmann didn’t want to discuss valuation, they said it was “very generous” and in line with how they see their business. Nor did they want to disclose specific revenue figures, but they did say that the company has tripled revenue three years running.

One thing fueling that growth is the sheer number of developers joining the platform. When we spoke to the company for its Series B in 2018, it had 300,000 sign-ups. Today that number has ballooned to 800,000.

As we wrote about the company in a 2018 article, it wants to change the way people develop websites:

“Netlify has abstracted away the concept of a web server, which it says is slow to deploy and hard to secure and scale. By shifting from a monolithic website to a static front end with back-end microservices, it believes it can solve security and scaling issues and deliver the site much faster.”
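In practice, that model looks something like the following minimal sketch: a pre-built static front end served from a CDN calling a small back-end microservice deployed as a Netlify Function. The function name and element id here are hypothetical; the /.netlify/functions/ path follows Netlify’s convention for invoking functions.

```typescript
// functions/hello.ts -- a small back-end microservice deployed as a Netlify
// Function (an AWS Lambda-style handler). It replaces what a traditional
// web server endpoint would have done.
export async function handler() {
  return {
    statusCode: 200,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message: "Hello from a back-end microservice" }),
  };
}

// In the static front end (pre-built files served from a CDN), the function
// is called over HTTP at Netlify's conventional path:
export async function loadGreeting(): Promise<void> {
  const res = await fetch("/.netlify/functions/hello");
  const data = await res.json();
  document.querySelector("#greeting")!.textContent = data.message;
}
```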

While developer popularity is a good starting point, getting larger customers on board is the ultimate goal that will drive more revenue, and the company wants to use its new injection of capital to build the enterprise side of the business. Current enterprise customers include Google, Facebook, Citrix and Unilever.

Netlify has grown from 38 to 97 employees since the beginning of last year and hopes to reach 180 by year’s end.

Mar 3, 2020

Datastax acquires The Last Pickle

Data management company Datastax, one of the largest contributors to the Apache Cassandra project, today announced that it has acquired The Last Pickle (and no, I don’t know what’s up with that name either), a New Zealand-based Cassandra consulting and services firm that’s behind a number of popular open-source tools for the distributed NoSQL database.

As Datastax Chief Strategy Officer Sam Ramji, who you may remember from his recent tenure at Apigee, the Cloud Foundry Foundation, Google and Autodesk, told me, The Last Pickle is one of the premier Apache Cassandra consulting and services companies. The team there has been building Cassandra-based open source solutions for the likes of Spotify, T-Mobile and AT&T since it was founded back in 2012. And while The Last Pickle is based in New Zealand, the company has engineers all over the world who do the heavy lifting and help these companies successfully implement the Cassandra database technology.

It’s worth mentioning that Last Pickle CEO Aaron Morton first discovered Cassandra when he worked for WETA Digital on the special effects for Avatar, where the team used Cassandra to allow the VFX artists to store their data.

“There’s two parts to what they do,” Ramji explained. “One is the very visible consulting, which has led them to become world experts in the operation of Cassandra. So as we automate Cassandra and as we improve the operability of the project with enterprises, their embodied wisdom about how to operate and scale Apache Cassandra is as good as it gets — the best in the world.” And The Last Pickle’s experience in building systems with tens of thousands of nodes — and the challenges that its customers face — is something Datastax can then offer to its customers as well.

And Datastax, of course, also plans to productize The Last Pickle’s open-source tools like the automated repair tool Reaper and the Medusa backup and restore system.

As both Ramji and Datastax VP of Engineering Josh McKenzie stressed, Cassandra has seen a lot of commercial development in recent years, with the likes of AWS now offering a managed Cassandra service, for example, but there wasn’t all that much hype around the project anymore. But they argue that’s a good thing. Now that it is over ten years old, Cassandra has been battle-hardened. For the last ten years, Ramji argues, the industry tried to figure out what the de facto standard for scale-out computing should be. By 2019, it became clear that Kubernetes was the answer to that.

“This next decade is about what is the de facto standard for scale-out data? We think that’s got certain affordances, certain structural needs and we think that the decades that Cassandra has spent getting hardened puts it in a position to be data for that wave.”

McKenzie also noted that Cassandra’s built-in features, like support for multiple data centers and geo-replication, rolling updates and live scaling, as well as wide support across programming languages, give it a number of advantages over competing databases.

“It’s easy to forget how much Cassandra gives you for free just based on its architecture,” he said. “Losing the power in an entire datacenter, upgrading the version of the database, hardware failing every day? No problem. The cluster is 100 percent always still up and available. The tooling and expertise of The Last Pickle really help bring all this distributed and resilient power into the hands of the masses.”
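As a rough illustration of what those built-in features look like from a developer’s seat, here is a minimal sketch using the DataStax Node.js driver (the cassandra-driver npm package). The contact point, data center names and keyspace are hypothetical; the NetworkTopologyStrategy replication settings are what give a cluster the multi-data-center, geo-replicated resilience McKenzie describes.

```typescript
// A minimal sketch using the DataStax Node.js driver (cassandra-driver).
// The contact point, data center names and keyspace are hypothetical.
import { Client } from "cassandra-driver";

const client = new Client({
  contactPoints: ["10.0.0.1"],
  localDataCenter: "us_east", // the driver prefers replicas in the local DC
});

async function main(): Promise<void> {
  await client.connect();

  // Replicate the keyspace across two data centers. This is the built-in
  // geo-replication described above: the cluster stays available even if
  // an entire data center goes dark.
  await client.execute(`
    CREATE KEYSPACE IF NOT EXISTS orders
    WITH replication = {
      'class': 'NetworkTopologyStrategy',
      'us_east': 3,
      'eu_west': 3
    }`);

  await client.shutdown();
}

main().catch(console.error);
```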

The two companies did not disclose the price of the acquisition.

Feb 27, 2020

London-based Gyana raises $3.9M for a no-code approach to data science

Coding and other computer science expertise remain some of the more important skills that a person can have in the working world today, but in the last few years, we have also seen a big rise in a new generation of tools providing an alternative way of reaping the fruits of technology: “no-code” software, which lets anyone — technical or non-technical — build apps, games, AI-based chatbots, and other products that used to be the exclusive terrain of engineers and computer scientists.

Today, one of the newer startups in the category — London-based Gyana, which lets non-technical people run data science analytics on any structured dataset — is announcing a round of £3 million to fuel its next stage of growth.

The round was led by U.K. firm Fuel Ventures, with other investors including Biz Stone of Twitter, Green Shores Capital and U+I, and it brings the total raised by the startup to $6.8 million since it was founded in 2015.

Gyana (Sanskrit for “knowledge”) was co-founded by Joyeeta Das and David Kell, who were both pursuing post-graduate degrees at Oxford: Das, a former engineer, was getting an MBA, and Kell was doing a Ph.D. in physics.

Das said the idea of building this tool came out of the fact that the pair could see a big disconnect emerging not just in their studies, but also in the world at large — not so much a digital divide as a digital light year separating those who know how to work in the realm of data science from those who don’t.

“Everyone talks about using data to inform decision making, and the world becoming data-driven, but actually that proposition is available to less than one percent of the world,” she said.

Out of that, the pair decided to work on building a platform that Das describes as a way to empower “citizen data scientists,” by letting users upload any structured data set (for example, a .CSV file) and run a series of queries on it to visualise trends and other insights more easily.

While the longer term goal may be for any person to be able to produce an analytical insight out of a long list of numbers, the more practical and immediate application has been in enterprise services and building tools for non-technical knowledge workers to make better, data-driven decisions.

To prove out its software, the startup first built an app based on the platform that it calls Neera (Sanskrit for “water”), which specifically parses footfall and other “human movement” metrics, useful for applications in retail, real estate and civic planning — for example, to determine how well certain retail locations are performing, measure footfall in popular locations, decide where to place or remove stores, or work out how to price a piece of property.

Although it started out aiming at mid-market and smaller companies — those most likely not to have in-house data scientists to meet their business needs — the startup has already picked up a series of customers that are actually quite a lot bigger than that. They include Vodafone, Barclays, EY, Pret a Manger, Knight Frank and the UK Ministry of Defence. It says it has some £1 million in contracts with these firms currently.

That, in turn, has served as the trigger to raise this latest round of funding and to launch Vayu (Sanskrit for “air”) — a more general purpose app that covers a wider set of parameters that can be applied to a dataset. So far, it has been adopted by academic researchers, financial services employees, and others that use analysis in their work, Das said.

With both Vayu and Neera, the aim — refreshingly — is to make the whole experience as privacy-friendly as possible, Das noted. Currently, you download an app if you want to use Gyana, and you keep your data local as you work on it. Gyana has no “anonymization” and no retention of data in its processes, except things like analytics around where your cursor hovers, so that Gyana knows how it can improve its product.

“There are always ways to reverse engineer these things,” Das said of anonymization. “We just wanted to make sure that we are not accidentally creating a situation where, despite learning from anonymised materials, you can’t reverse engineer what people are analysing. We are just not convinced.”

While there is something commendable about building and shipping a tool with a lot of potential to it, Gyana runs the risk of facing what I think of as the “water, water everywhere” problem. Sometimes if a person really has no experience or specific aim, it can be hard to think of how to get started when you can do anything. Das said they have also identified this, and so while currently Gyana already offers some tutorials and helper tools within the app to nudge the user along, the plan is to eventually bring in a large variety of datasets for people to get started with, and also to develop a more intuitive way to “read” the basics of the files in order to figure out what kinds of data inquiries a person is most likely to want to make.

The rise of “no-code” software has been a swift one in the world of tech, spanning the proliferation of startups, big acquisitions, and large funding rounds. Companies like Airtable and DashDash are aimed at building analytics leaning on interfaces that follow the basic design of a spreadsheet; AppSheet, which is a no-code mobile app building platform, was recently acquired by Google; and Roblox (for building games without needing to code) and Unqork (for app development) have both raised significant funding just this week. In the area of no-code data analytics and visualisation, there are biggies like Tableau, as well as Trifacta, RapidMiner and more.

Gartner predicts that by 2024, some 65% of all app development will be made on low- or no-code platforms, and Forrester estimates that the no- and low-code market will be worth some $10 billion this year, rising to $21.2 billion by 2024.

That represents a big business opportunity for the likes of Gyana, which has been unique in using the no-code approach specifically to tackle the area of data science.

However, in the spirit of citizen data scientists, the intention is to keep a consumer version of the apps free to use as it works on signing up enterprise users with more enhanced paid products, which will be priced on an annual license basis (currently clients are paying between $6,000 and $12,000 depending on usage, she said).

“We want to do free for as long as we can,” Das said, both in relation to the data tools and the datasets that it will offer to users. “The biggest value add is not about accessing premium data that is hard to get. We are not a data marketplace but we want to provide data that makes sense to access,” adding that even with business users, “we’d like you to do 90% of what you want to do without paying for anything.”
