Aug 21, 2020

As the pandemic creates supply chain chaos, Craft raises $10M to apply some intelligence

During the COVID-19 pandemic, supply chains have suddenly become hot. Who knew that would ever happen? The race to secure PPE, ventilators and minor things like food was, and still is, an enormous issue. But, perhaps predictably, the world of “supply chain software” could use some updating. Most of the platforms are deployed “empty” and require clients to bring their own data. The UIs are often outdated and still have to be juggled alongside manual, offline workflows. So startups working in this space are now attracting some timely attention.

Thus, Craft, the enterprise intelligence company, today announces it has closed a $10 million Series A financing round to build what it characterizes as a “supply chain intelligence platform.” With the new funding, Craft will expand its offices in San Francisco, London and Minsk, and grow remote teams across engineering, sales, marketing and operations in North America and Europe.

It competes with some large incumbents, such as Dun & Bradstreet, Bureau van Dijk and Thomson Reuters. These are traditional data providers focused primarily on financial data about public companies, rather than real-time data such as operating metrics, human capital and risk metrics.

The idea is to allow companies to monitor and optimize their supply chain and enterprise systems. The financing was led by High Alpha Capital, alongside Greycroft. Craft also has some high-flying angel investors, including Sam Palmisano, chairman of the Center for Global Enterprise and former CEO and chairman of IBM; Jim Moffatt, former CEO of Deloitte Consulting; Frederic Kerrest, executive vice chairman, COO and co-founder of Okta; and Uncork Capital, which previously led Craft’s seed financing. High Alpha partner Kristian Andersen is joining Craft’s board of directors.

The problem Craft is attacking is a lack of visibility into complex global supply chains. For obvious reasons, COVID-19 disrupted global supply chains, revealing a lot of risks and structural weaknesses across industries, along with a lack of intelligence about how it all holds together. Craft’s solution is a proprietary data platform, API and portal that integrates into existing enterprise workflows.

While many business intelligence products require clients to bring their own data, Craft’s data platform comes pre-deployed with data from thousands of financial and alternative sources, covering 300+ data points that are refreshed using both machine learning and human validation. Its open-to-the-web company profiles appear in 50 million search results, for instance.
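To give a sense of what a pre-populated data platform behind an API means in practice, here is a minimal sketch of how an enterprise client might pull a supplier profile into its own systems. Craft’s actual API is not documented in this article, so the endpoint, authentication scheme and field names below are purely hypothetical placeholders.

```python
# Hypothetical sketch: the base URL, auth header and response fields are
# illustrative assumptions, not Craft's documented API.
import requests

API_BASE = "https://api.example-supplier-intel.com/v1"  # placeholder (assumption)
API_KEY = "YOUR_API_KEY"                                # issued by the vendor

def get_supplier_profile(domain: str) -> dict:
    """Fetch a pre-populated company profile (financial, operating,
    human-capital and risk metrics) for a supplier, keyed by web domain."""
    resp = requests.get(
        f"{API_BASE}/companies/{domain}",
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    profile = get_supplier_profile("acme-widgets.example")
    # Surface a few of the refreshed data points, e.g. for a risk dashboard.
    for field in ("headcount", "revenue_estimate", "risk_score"):
        print(field, profile.get(field))
```

The point of the sketch is the workflow, not the specific fields: profiles arrive already populated, so the integration work is limited to pulling them into existing dashboards and monitoring systems rather than assembling the underlying data.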

Ilya Levtov, co-founder and CEO of Craft, said in a statement: “Today, we are focused on providing powerful tracking and visibility to enterprise supply chains, while our ultimate vision is to build the intelligence layer of the enterprise technology stack.”

Kristian Andersen, partner with High Alpha, commented: “We have a deep conviction that supply chain management remains an underinvested and under-innovated category in enterprise software.”

Craft claims its revenues grew nearly threefold in the first half of 2020, with Fortune 100 companies, government and military agencies, and SMEs among its clients.

Mar 16, 2018

With great tech success comes even greater responsibility

As we watch major tech platforms evolve over time, it’s clear that companies like Facebook, Apple, Google and Amazon (among others) have created businesses that are having a huge impact on humanity — sometimes positive and other times not so much.

That suggests these platforms have to understand how people are using them, and to recognize when users, or the companies themselves, are trying to manipulate them or use them for nefarious purposes. We can apply that same responsibility filter to individual technologies like artificial intelligence, and indeed to any advanced technology and the impact it could have on society over time.

This was a running theme this week at the South by Southwest conference in Austin, Texas.

The AI debate rages on

While the platform plays are clearly on the front lines of this discussion, tech icon Elon Musk repeated his concerns about AI running amok in a Q&A at South by Southwest. He worries that it won’t be long before we graduate from the narrow (and not terribly smart) AI we have today to a more generalized AI. He is particularly concerned that a strong AI could develop and evolve over time to the point it eventually matches the intellectual capabilities of humans. Of course, as TechCrunch’s Jon Shieber wrote, Musk sees his stable of companies as a kind of hedge against such a possible apocalypse.

Elon Musk with Jonathan Nolan at South by Southwest 2018. Photo: Getty Images/Chris Saucedo

“Narrow AI is not a species-level risk. It will result in dislocation… lost jobs… better weaponry and that sort of thing. It is not a fundamental, species-level risk, but digital super-intelligence is,” he told the South by Southwest audience.

He went so far as to suggest it could be more of a threat than nuclear warheads in terms of the kind of impact it could have on humanity.

Taking responsibility

Whether you agree with that assessment or not, or even if you think he is being somewhat self-serving with his warnings to promote his companies, he could be touching on something important about corporate responsibility around technology, something that startups and established companies alike should heed.

It was certainly on the mind of Apple’s Eddy Cue, who was interviewed on stage at SXSW by CNN’s Dylan Byers this week. “Tech is a great thing and makes humans more capable, but in of itself is not for good. People who make it, have to make it for good,” Cue said.

We can be sure that Twitter’s creators never imagined a world where bots would be launched to influence an election when they created the company more than a decade ago. Over time, though, it has become crystal clear that Twitter, and indeed all large platforms, can be used for a variety of purposes, and the platforms have to react when they think certain parties are using their networks to manipulate parts of the populace.

Apple’s Eddy Cue speaking at South by Southwest 2018. Photo: Ron Miller

Cue dodged Byers’ questions about competing platforms, saying he could only speak to what Apple was doing because he didn’t have an inside view of companies like Facebook and Google (which he never actually mentioned by name). “I think our company is different than what you’re talking about. Our customers’ privacy is of utmost importance to us,” he said. That includes, he said, limiting the amount of data Apple collects, because it isn’t worried about having enough data to serve more meaningful ads. “We don’t care where you shop or what you buy,” he added.

Andy O’Connell from Facebook’s Global Policy Development team, speaking on a panel about the challenges of using AI to filter “fake news,” said that Facebook recognizes it can and should play a role if it sees people manipulating the platform. “This is a whole society issue, but there are technical things we are doing and things we can invest in [to help lessen the impact of fake news],” he said. He added that Facebook co-founder and CEO Mark Zuckerberg has framed it as a challenge to the company to make the platform more secure, which includes reducing the amount of false or misleading news that makes it onto the platform.

Recognizing tech’s limitations

As O’Connell put forth, this is not just a Facebook problem or a general technology problem. It’s a social problem, and society as a whole needs to address it. Sometimes tech can help, but we can’t always look to tech to solve every problem. The trouble is that we can never really anticipate how a given piece of technology will behave, or how people will use it, once we put it out there.

Photo: Ron Miller

All of this suggests that none of these problems, some of which we could never have even imagined, are easy to solve. For every action and reaction, there can be another set of unintended consequences, even with the best of intentions.

But it’s up to the companies who are developing the tech to recognize the responsibility that comes with great economic success, or simply with the impact that whatever they are creating could have on society. “Everyone has a responsibility [to draw clear lines]. It is something we do and how we want to run our company. In today’s world people have to take responsibility and we intend to do that,” Cue said.

It’s got to be more than lip service though. It requires thought and care and reacting when things do run amok, while continually assessing the impact of every decision.

Oct 11, 2017

ROSS Intelligence lands $8.7M Series A to speed up legal research with AI

Armed with an understanding of machine learning, ROSS Intelligence is going after LexisNexis and Thomson Reuters for ownership of legal research. The startup, founded in 2015 by Andrew Arruda, Jimoh Ovbiagele and Pargles Dall’Oglio at the University of Toronto, is announcing an $8.7 million Series A today led by iNovia Capital with participation from Comcast Ventures Catalyst Fund,…
