Oct 12, 2018

IBM files formal JEDI protest a day before bidding process closes

IBM announced yesterday that it has filed a formal protest with the U.S. Government Accountability Office over the structure of the Pentagon’s winner-take-all $10 billion, 10-year JEDI cloud contract. The protest came just a day before the bidding process is scheduled to close. As IBM put it in a blog post, it takes issue with the single-vendor approach. It is certainly not alone.

Just about every vendor short of Amazon, which has remained mostly quiet, has been complaining about this strategy. IBM certainly faces a tough fight going up against Amazon and Microsoft.

IBM doesn’t disguise the fact that it thinks the contract has been written for Amazon to win, and it believes the one-vendor approach simply doesn’t make sense. “No business in the world would build a cloud the way JEDI would and then lock in to it for a decade. JEDI turns its back on the preferences of Congress and the administration, is a bad use of taxpayer dollars and was written with just one company in mind,” IBM wrote in the blog post explaining why it was protesting the deal before a decision was made or the bidding was even closed.

For the record, DOD spokesperson Heather Babb told TechCrunch last month that the bidding is open and no vendor is favored. “The JEDI Cloud final RFP reflects the unique and critical needs of DOD, employing the best practices of competitive pricing and security. No vendors have been pre-selected,” she said.

Much like Oracle, which filed a protest of its own back in August, IBM is a traditional vendor that was late to the cloud. It began a journey to build a cloud business in 2013 when it purchased Infrastructure as a Service vendor SoftLayer and has been using its checkbook to buy software services to add on top of SoftLayer ever since. IBM has concentrated on building cloud services around AI, security, big data, blockchain and other emerging technologies.

Both IBM and Oracle have a problem with the one-vendor approach, especially one that locks in the government for a 10-year period. It’s worth pointing out that the contract is actually an initial two-year deal with two additional three-year options and a final two-year option. The DOD has left open the possibility that this might not go the entire 10 years.

It’s also worth putting the contract in perspective. While 10 years and $10 billion is nothing to sneeze at, neither is it as market-altering as it might appear, not when some are predicting the cloud will be a $100 billion-a-year market very soon.

IBM uses the blog post as a kind of sales pitch as to why it’s a good choice, while at the same time pointing out the flaws in the single vendor approach and complaining that it’s geared toward a single unnamed vendor that we all know is Amazon.

The bidding process closes today, and unless something changes as a result of these protests, the winner will be selected next April.

Sep 29, 2018

What each cloud company could bring to the Pentagon’s $10B JEDI cloud contract

The Pentagon is going to make one cloud vendor exceedingly happy when it chooses the winner of the $10 billion, ten-year enterprise cloud project dubbed the Joint Enterprise Defense Infrastructure (or JEDI for short). The contract is designed to establish the cloud technology strategy for the military over the next 10 years as it begins to take advantage of current trends like Internet of Things, artificial intelligence and big data.

Ten billion dollars spread out over ten years may not entirely alter a market that’s expected to reach $100 billion a year very soon, but it is substantial enough to give a lesser vendor much greater visibility, and possibly a deeper entrée into other government and private sector business. The cloud companies certainly recognize that.


That could explain why they are tripping over themselves to change the contract dynamics, insisting, maybe rightly, that a multi-vendor approach would make more sense.

One look at the Request for Proposal (RFP) itself, which has dozens of documents outlining various criteria from security to training to the specification of the single award itself, shows the sheer complexity of this proposal. At the heart of it is a package of classified and unclassified infrastructure, platform and support services with other components around portability. Each of the main cloud vendors we’ll explore here offers these services. They are not unusual in themselves, but they do each bring a different set of skills and experiences to bear on a project like this.

It’s worth noting that the DOD isn’t just interested in technical chops; it is also looking closely at pricing, and has explicitly asked for specific discounts that would be applied to each component. The RFP process closes on October 12th and the winner is expected to be chosen next April.

Amazon

What can you say about Amazon? They are by far the dominant cloud infrastructure vendor, and they have the advantage of having scored a large government contract in the past, when they built the CIA’s private cloud in 2013, earning $600 million for their troubles. Amazon also offers GovCloud, the product that came out of that project, designed to host sensitive data.


Many of the other vendors worry that gives Amazon a leg up on this deal. While five years is a long time, especially in technology terms, if anything Amazon has tightened its control of the market since then. Heck, most of the other players were just beginning to establish their cloud businesses in 2013. Amazon, which launched its cloud business in 2006, has a maturity the others lack, and it is still innovating, introducing dozens of new features every year. That makes it difficult to compete with, but even the biggest player can be taken down with the right game plan.

Microsoft

If anyone can take Amazon on, it’s Microsoft. While they were somewhat late to the cloud, they have more than made up for it over the last several years. They are growing fast, yet are still far behind Amazon in terms of pure market share. Still, they have a lot to offer the Pentagon, including a combination of Azure, their cloud platform, and Office 365, the popular business suite that includes Word, PowerPoint, Excel and Outlook email. What’s more, they have a fat $900 million contract with the DOD, signed in 2016, for Windows and related hardware.


Azure Stack is particularly well suited to a military scenario. It’s a private cloud platform that lets you stand up a mini version of the Azure public cloud, fully compatible with public Azure in terms of APIs and tools. The company also has Azure Government Cloud, which is certified for use by many of the U.S. government’s branches, including at DOD Level 5. Microsoft also brings years of experience working with large enterprises and government clients, meaning it knows how to manage a large contract like this.

Google

When we talk about the cloud, we tend to think of the Big Three. The third member of that group is Google. They have been working hard to establish their enterprise cloud business since 2015, when they brought in Diane Greene to reorganize the cloud unit and give them some enterprise cred. They still have a relatively small share of the market, but they are taking the long view, knowing that there is plenty of market left to conquer.


They have taken an approach of open sourcing a lot of the tools they use in-house, then offering cloud versions of those same services, arguing that nobody knows better than they do how to manage large-scale operations. They have a point, and that could play well in a bid for this contract, but they also stepped away from an artificial intelligence contract with the DOD called Project Maven when a group of their employees objected. It’s not clear whether that would be held against them in the bidding process here.

IBM

IBM has been using its checkbook to build a broad platform of cloud services since 2013, when it bought SoftLayer to give it infrastructure services, while adding software and development tools over the years and emphasizing AI, big data, security, blockchain and other services. All the while, it has been trying to take full advantage of its artificial intelligence engine, Watson.


As one of the primary technology brands of the 20th century, the company has vast experience working with contracts of this scope and with large enterprise clients and governments. It’s not clear if this translates to its more recently developed cloud services, or if it has the cloud maturity of the others, especially Microsoft and Amazon. In that light, it would have its work cut out for it to win a contract like this.

Oracle

Oracle has been complaining since last spring to anyone who will listen, including reportedly the president, that the JEDI RFP is unfairly written to favor Amazon, a charge that DOD firmly denies. They have even filed a formal protest against the process itself.

That could be a smoke screen, because the company was late to the cloud, took years to take it seriously as a concept, and barely registers today in terms of market share. What it does bring to the table is decades of broad enterprise experience and one of the most popular enterprise databases of the last 40 years.


It recently began offering a self-repairing database in the cloud that could prove attractive to the DOD, but whether its other offerings are enough to help it win this contract remains to be seen.

Sep 24, 2018

Walmart is betting on the blockchain to improve food safety

Walmart has been working with IBM on a food safety blockchain solution, and today it announced it is requiring that all suppliers of leafy green vegetables for Sam’s Club and Walmart upload their data to the blockchain by September 2019.

Most supply chains are bogged down in manual processes. This makes it difficult and time-consuming to track down an issue, should one like last spring’s E. coli contamination of romaine lettuce rear its head. Placing a supply chain on the blockchain makes the process more traceable, transparent and fully digital. Each node on the blockchain can represent an entity that has handled the food on the way to the store, making it much easier and faster to determine, with far greater precision, whether an affected farm sold contaminated supply to a particular location.
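To make the mechanics concrete, here is a minimal, illustrative sketch of that idea in Python. It is not the Food Trust API; the ledger class, handler names and lot IDs are all invented. The point is that once every custody event lands in one shared, hash-linked log, a trace-back becomes a single query over that log rather than a round of manual record requests.

```python
# Illustrative only: a toy hash-linked ledger of custody events, not the
# actual IBM Food Trust API. Handler names and lot IDs are invented.
import hashlib
import json
import time


class TraceLedger:
    def __init__(self):
        self.blocks = []  # shared, append-only log of custody events

    def record(self, lot_id: str, handler: str, event: str) -> None:
        # Each record embeds the hash of the previous one, so history
        # can't be quietly rewritten after the fact.
        prev_hash = self.blocks[-1]["hash"] if self.blocks else "0" * 64
        block = {"lot_id": lot_id, "handler": handler, "event": event,
                 "ts": time.time(), "prev": prev_hash}
        block["hash"] = hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()).hexdigest()
        self.blocks.append(block)

    def trace(self, lot_id: str) -> list:
        # One scan of a shared log answers "who handled this lot?"
        # instead of querying each supplier manually.
        return [b for b in self.blocks if b["lot_id"] == lot_id]


ledger = TraceLedger()
ledger.record("romaine-042", "Example Farm Co", "harvested")
ledger.record("romaine-042", "Example Freight", "shipped")
ledger.record("romaine-042", "Store #4511", "received")
print([(b["handler"], b["event"]) for b in ledger.trace("romaine-042")])
```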

Walmart has been working with IBM for over a year on using the blockchain to digitize the food supply chain process. In fact, supply chain is one of the premier business use cases for blockchain (beyond digital currency). Walmart is using the IBM Food Trust solution, specifically developed for this use case.

“We built the IBM Food Trust solution using IBM Blockchain Platform, which is a tool or capability that IBM has built to help companies build, govern and run blockchain networks. It’s built using Hyperledger Fabric (the open source digital ledger technology) and it runs on IBM Cloud,” Bridget van Kralingen, IBM’s senior VP for Global Industries, Platforms and Blockchain explained.

Before moving the process to the blockchain, it typically took approximately seven days to trace the source of food. With the blockchain, that has been reduced to 2.2 seconds. That substantially reduces the likelihood that infected food will reach the consumer.


One of the issues in requiring suppliers to put their information on the blockchain is that they take a range of approaches, from paper to Excel spreadsheets to sophisticated ERP systems, all feeding data to the blockchain. Walmart spokesperson Molly Blakeman says this is something the company worked hard on with IBM to account for. Suppliers don’t have to be blockchain experts by any means. They simply have to know how to upload data to the blockchain application.

“IBM will offer an onboarding system that orients users with the service easily. Think about when you get a new iPhone – the instructions are easy to understand and you’re quickly up and running. That’s the aim here. Essentially, suppliers will need a smart device and internet to participate,” she said.

After working with it for a year, the company thinks it’s ready for broader implementation, with the ultimate goal of making sure that the food sold at Walmart is safe for consumption, and, if there is a problem, making auditing the supply chain a trivial activity.

“Our customers deserve a more transparent supply chain. We felt the one-step-up and one-step-back model of food traceability was outdated for the 21st century. This is a smart, technology-supported move that will greatly benefit our customers and transform the food system, benefitting all stakeholders,” Frank Yiannas, vice president of food safety for Walmart, said in a statement.

In addition to the blockchain requirement, the company says it is also requiring that suppliers adhere to one of the Global Food Safety Initiative (GFSI) standards, which are internationally recognized food safety standards.

Sep 19, 2018

IBM launches cloud tool to detect AI bias and explain automated decisions

IBM has launched a software service that scans AI systems as they work in order to detect bias and provide explanations for the automated decisions being made — a degree of transparency that may be necessary for compliance purposes, not just for a company’s own due diligence.

The new trust and transparency system runs on the IBM cloud and works with models built from what IBM bills as a wide variety of popular machine learning frameworks and AI-build environments — including its own Watson tech, as well as TensorFlow, SparkML, AWS SageMaker, and AzureML.

It says the service can be customized to specific organizational needs via programming to take account of the “unique decision factors of any business workflow”.

The fully automated SaaS explains decision-making and detects bias in AI models at runtime — so as decisions are being made — which means it’s capturing “potentially unfair outcomes as they occur”, as IBM puts it.

It will also automatically recommend data to add to the model to help mitigate any bias that has been detected.

Explanations of AI decisions include showing which factors weighted the decision in one direction vs another; the confidence in the recommendation; and the factors behind that confidence.

IBM also says the software keeps records of the AI model’s accuracy, performance and fairness, along with the lineage of the AI systems — meaning they can be “easily traced and recalled for customer service, regulatory or compliance reasons”.

For one example on the compliance front, the EU’s GDPR privacy framework references automated decision making, and includes a right for people to be given detailed explanations of how algorithms work in certain scenarios — meaning businesses may need to be able to audit their AIs.

The IBM AI scanner tool provides a breakdown of automated decisions via visual dashboards — an approach it bills as reducing dependency on “specialized AI skills”.

However, it also intends its own professional services staff to work with businesses on using the new software service. So it will be selling AI, a ‘fix’ for AI’s imperfections, and experts to help smooth any wrinkles when enterprises are trying to fix their AIs… which suggests that while AI will indeed remove some jobs, automation will be busy creating other types of work.

Nor is IBM the first professional services firm to spot a business opportunity around AI bias. A few months ago Accenture outed a fairness tool for identifying and fixing unfair AIs.

So with a major push towards automation across multiple industries there also looks to be a pretty sizeable scramble to set up and sell services to patch any problems that arise as a result of increasing use of AI.

And, indeed, to encourage more businesses to feel confident about jumping in and automating more. (On that front IBM cites research it conducted which found that while 82% of enterprises are considering AI deployments, 60% fear liability issues and 63% lack the in-house talent to confidently manage the technology.)

In addition to launching its own (paid-for) AI auditing tool, IBM says its research division will be open sourcing an AI bias detection and mitigation toolkit — with the aim of encouraging “global collaboration around addressing bias in AI”.
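IBM Research released that toolkit as AI Fairness 360 (AIF360). Assuming AIF360 is the toolkit referenced here, a minimal sketch of the kind of dataset-level check it supports might look like this (the loan decisions are a toy example invented for illustration):

```python
# A minimal sketch using IBM's open-source AI Fairness 360 (AIF360),
# assumed to be the toolkit referenced above. The loan data is a toy
# example invented for illustration.
import pandas as pd
from aif360.datasets import BinaryLabelDataset
from aif360.metrics import BinaryLabelDatasetMetric

# Toy decisions: 1 = loan approved, 0 = denied; "group" is a protected attribute.
df = pd.DataFrame({
    "group":    [0, 0, 0, 0, 1, 1, 1, 1],
    "approved": [1, 0, 0, 0, 1, 1, 1, 0],
})

dataset = BinaryLabelDataset(
    df=df,
    label_names=["approved"],
    protected_attribute_names=["group"],
)

metric = BinaryLabelDatasetMetric(
    dataset,
    unprivileged_groups=[{"group": 0}],
    privileged_groups=[{"group": 1}],
)

# Disparate impact: ratio of favorable-outcome rates between groups (1.0 = parity).
print("disparate impact:", metric.disparate_impact())
# Statistical parity difference: rate(unprivileged) - rate(privileged).
print("parity difference:", metric.statistical_parity_difference())
```

The cloud service described above runs this sort of check continuously, at decision time, rather than over a static table.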

“IBM led the industry in establishing trust and transparency principles for the development of new AI technologies. It’s time to translate principles into practice,” said David Kenny, SVP of cognitive solutions at IBM, commenting in a statement. “We are giving new transparency and control to the businesses who use AI and face the most potential risk from any flawed decision making.”

Sep 13, 2018

Hacera creates directory to make blockchain projects more searchable

In the 1990s, when the web was young, companies like Yahoo created directories of web pages to help make them more discoverable. Hacera wants to bring that same idea to the blockchain, and today it announced the launch of the Hacera Network Registry.

CEO Jonathan Levi says that blockchains being established today risk being isolated because people simply can’t find them. If you have a project like the IBM-Maersk supply chain blockchain announced last month, how does an interested party like a supplier or customs authority find it and ask to participate? Until the creation of this registry, there was no easy way to search for projects.

Early participants include heavy hitters like Microsoft, Hitachi, Huawei, IBM, SAP and Oracle, who are linking to projects being created on their platforms. The registry supports projects based on major digital ledger communities including Hyperledger, Quorum, Cosmos, Ethereum and Corda. The Hacera Network Registry is built on Hyperledger Fabric, and the code is open source. (Levi was Risk Manager for Hyperledger Fabric 1.0.)


While early sponsors of the project include IBM and Hyperledger Fabric, Levi stressed the network is open to all. Blockchain projects can create information pages, not unlike a personal LinkedIn page, and Hacera verifies the data before adding it to the registry. There are currently more than 70 networks in the registry, and Hacera is hoping this is just the beginning.

Jerry Cuomo, VP of blockchain technologies at IBM, says that for blockchain to grow, it will require a way to register, look up, join and transact across a variety of blockchain solutions. “As the number of blockchain consortiums, networks and applications continues to grow we need a means to list them and make them known to the world, in order to unleash the power of blockchain,” Cuomo told TechCrunch. Hacera is solving that problem.

This is exactly the kind of underlying infrastructure that the blockchain requires to expand as a technology, and Cuomo certainly recognizes this. “We realized from the start that you cannot do blockchain on your own; you need a vibrant community and ecosystem of like-minded innovators who share the vision of helping to transform the way companies conduct business in the global economy,” he said.

Hacera understands that every cloud vendor wants people using their blockchain service. Yet they also see that to move the technology forward, there need to be some standard ways of conducting business, and they want to provide that layer. Levi has a broader vision for the network beyond pure discoverability. He hopes eventually to provide the means to share data through the registry.

Sep 6, 2018

PagerDuty raises $90M to wake up more engineers in the middle of the night

PagerDuty, the popular service that helps businesses monitor their tech stacks, manage incidents and alert engineers when things go sideways, today announced that it has raised a $90 million Series D round at a valuation of $1.3 billion. With this, PagerDuty, which was founded in 2009, has now raised well over $170 million.

The round was led by T. Rowe Price Associates and Wellington Management. Accel, Andreessen Horowitz and Bessemer Venture Partners also participated. Given the leads in this round, chances are that PagerDuty is gearing up for an IPO.

“This capital infusion allows us to continue our investments in innovation that leverages artificial intelligence and machine learning, enabling us to help our customers transform their companies and delight their customers,” said Jennifer Tejada, CEO at PagerDuty in today’s announcement. “From a business standpoint, we can strengthen our investment in and development of our people, our most valuable asset, as we scale our operations globally. We’re well positioned to make the lives of digital workers better by elevating work to the outcomes that matter.”

Currently, PagerDuty users include the likes of GE, Capital One, IBM, Spotify and virtually every other software company you’ve ever heard of. In total, more than 10,500 enterprises now use the service. While it’s best known for its alerting capabilities, PagerDuty has expanded well beyond that over the years, though alerting is still a core part of its service. Earlier this year, for example, the company announced its new AIOps services, which aim to help businesses reduce the number of noisy and unnecessary alerts. I’m sure there are a lot of engineers who are quite happy about that (and now sleep better).

Aug 29, 2018

Google takes a step back from running the Kubernetes development infrastructure

Google today announced that it is providing the Cloud Native Computing Foundation (CNCF) with $9 million in Google Cloud credits to help further its work on the Kubernetes container orchestrator and that it is handing over operational control of the project to the community. These credits will be split over three years and are meant to cover the infrastructure costs of building, testing and distributing the Kubernetes software.

Why does this matter? Until now, Google hosted virtually all the cloud resources that supported the project, like its CI/CD testing infrastructure, container downloads and DNS services on its cloud. But Google is now taking a step back. With the Kubernetes community reaching a state of maturity, Google is transferring all of this to the community.

Between the testing infrastructure and hosting container downloads, the Kubernetes project regularly runs more than 150,000 containers on 5,000 virtual machines, so the cost of running these systems quickly adds up. The Kubernetes container registry has served almost 130 million downloads since the launch of the project.

It’s also worth noting that the CNCF now includes a wide range of members that typically compete with each other. We’re talking Alibaba Cloud, AWS, Microsoft Azure, Google Cloud, IBM Cloud, Oracle, SAP and VMware, for example. All of these profit from the work of the CNCF and the Kubernetes community. Google doesn’t say so outright, but it’s fair to assume that it wanted others to shoulder some of the burdens of running the Kubernetes infrastructure, too. Similarly, some of the members of the community surely didn’t want to be so closely tied to Google’s infrastructure, either.

“By sharing the operational responsibilities for Kubernetes with contributors to the project, we look forward to seeing the new ideas and efficiencies that all Kubernetes contributors bring to the project operations,” Google Kubernetes Engine product manager William Deniss writes in today’s announcement. He also notes that a number of Google’s engineers will still be involved in running the Kubernetes infrastructure.

“Google’s significant financial donation to the Kubernetes community will help ensure that the project’s constant pace of innovation and broad adoption continue unabated,” said Dan Kohn, the executive director of the CNCF. “We’re thrilled to see Google Cloud transfer management of the Kubernetes testing and infrastructure projects into contributors’ hands — making the project not just open source, but openly managed, by an open community.”

It’s unclear whether the project plans to take some of the Google-hosted infrastructure and move it to another cloud, but it could definitely do so — and other cloud providers could step up and offer similar credits, too.

Aug 9, 2018

IBM teams with Maersk on new blockchain shipping solution

IBM and shipping giant Maersk have been working together for the last year developing a blockchain-based shipping solution called TradeLens. Today they moved the project from beta into limited availability.

Marie Wieck, GM for IBM Blockchain, says the product provides a way to digitize every step of the global trade workflow, transforming it into a real-time communication and visual data-sharing tool.

TradeLens was developed jointly by the two companies with IBM providing the underlying blockchain technology and Maersk bringing the worldwide shipping expertise. It involves three components: the blockchain, which provides a mechanism for tracking goods from factory or field to delivery, APIs for others to build new applications on top of the platform these two companies have built, and a set of standards to facilitate data sharing among the different entities in the workflow such as customs, ports and shipping companies.

Wieck says the blockchain really changes how companies have traditionally tracked shipped goods. While many of the entities in the system have digitized the process, the data they have has been trapped in silos and previous attempts at sharing like EDI have been limited. “The challenge is they tend to think of a linear flow and you really only have visibility one [level] up and one down in your value chain,” she said.

The blockchain provides a couple of obvious advantages over previous methods. For starters, she says, it’s more secure, because the data is distributed and digitally encrypted. The greatest advantage, though, is the visibility it provides. Every participant can check any aspect of the flow in real time, and an auditor or other authority can easily track the entire process from start to finish by clicking on a block in the blockchain, instead of requesting data from each entity manually.

While she says it won’t entirely prevent fraud, it does help reduce it by putting more eyeballs onto the process. “If you had fraudulent data at start, blockchain won’t help prevent that. What it does help with is that you have multiple people validating every data set and you get greater visibility when something doesn’t look right,” she said.

As for the APIs, she sees the system becoming a shipping information platform. Developers can build on top of that, taking advantage of the data in the system to build even greater efficiencies. The standards help pull it together and align with APIs, such as providing a standard Bill of Lading. They are starting by incorporating existing industry standards, but are also looking for gaps that slow things down to add new standard approaches that would benefit everyone in the system.
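IBM hasn’t published the API details in this announcement, so purely as a hypothetical sketch, a developer building on such a platform might fetch a consignment’s event chain like this (the endpoint, field names and event types below are invented, not the actual TradeLens API):

```python
# Hypothetical sketch only: the endpoint, fields and event names below are
# invented to illustrate the idea of an API layer over the shipping ledger;
# they are not the actual TradeLens API.
import requests

BASE_URL = "https://tradelens.example.com/api/v1"  # invented endpoint


def shipment_events(consignment_id: str, token: str) -> list:
    """Return the ordered custody/milestone events recorded for one
    consignment (e.g. gate-in, vessel load, customs release)."""
    resp = requests.get(
        f"{BASE_URL}/consignments/{consignment_id}/events",
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["events"]


def dwell_seconds(events: list, start: str = "gate-in",
                  end: str = "vessel-load") -> float:
    """How long a container sat between two recorded milestones: the kind
    of efficiency metric an API layer over the ledger makes cheap."""
    times = {e["type"]: e["timestamp"] for e in events}
    return times[end] - times[start]
```

The standards component is what would make a helper like dwell_seconds workable: it only works if every participant records milestones using shared event types.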

So far, the companies have 94 entities in 300 locations around the world using TradeLens including customs authorities, ports, cargo shippers and logistics companies. They are opening the program to limited availability today with the goal of a full launch by the end of this year.

Wieck ultimately sees TradeLens as a way to facilitate trade by building in trust, the end goal of any blockchain product. “By virtue of already having an early adopter program, and having coverage of 300 trading locations around the world, it is a very good basis for the global exchange of information. And I personally think visibility creates trust, and that can help in a myriad of ways,” she said.

Jul 31, 2018

The Istio service mesh hits version 1.0

Istio, the service mesh for microservices from Google, IBM, Lyft, Red Hat and many other players in the open-source community, launched version 1.0 of its tools today.

If you’re not into service meshes, that’s understandable. Few people are. But Istio is probably one of the most important new open-source projects out there right now. It sits at the intersection of a number of industry trends, like containers, microservices and serverless computing, and makes it easier for enterprises to embrace them. Istio now has more than 200 contributors and the code has seen more than 4,000 check-ins since the launch of version 0.1.

Istio, at its core, handles the routing, load balancing, flow control and security needs of microservices. It sits on top of existing distributed applications and basically helps them talk to each other securely, while also providing logging, telemetry and the necessary policies that keep things under control (and secure). It also features support for canary releases, which allow developers to test updates with a few users before launching them to a wider audience, something that Google and other webscale companies have long done internally.
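As a concrete example of that canary support, here is a minimal sketch of an Istio routing rule that sends 10 percent of traffic for a hypothetical “reviews” service to a new v2 release, assuming a matching DestinationRule defines the v1 and v2 subsets:

```yaml
# Minimal canary sketch: a VirtualService splitting traffic 90/10 between
# two versions of a hypothetical "reviews" service. Assumes a DestinationRule
# defines the v1 and v2 subsets by pod label.
apiVersion: networking.istio.io/v1alpha3
kind: VirtualService
metadata:
  name: reviews
spec:
  hosts:
  - reviews
  http:
  - route:
    - destination:
        host: reviews
        subset: v1
      weight: 90
    - destination:
        host: reviews
        subset: v2
      weight: 10
```

A deployment tool can then shift the weights gradually as confidence in v2 grows.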

“In the area of microservices, things are moving so quickly,” Google product manager Jennifer Lin told me. “And with the success of Kubernetes and the abstraction around container orchestration, Istio was formed as an open-source project to really take the next step in terms of a substrate for microservice development as well as a path for VM-based workloads to move into more of a service management layer. So it’s really focused around the right level of abstractions for services and creating a consistent environment for managing that.”

Even before the 1.0 release, a number of companies had already adopted Istio in production, including the likes of eBay and Auto Trader UK. Lin argues that this is a sign that Istio solves a problem a lot of businesses are facing today as they adopt microservices. “A number of more sophisticated customers tried to build their own service management layer and while we hadn’t yet declared 1.0, we had a number of customers — including a surprising number of large enterprise customers — say, ‘you know, even though you’re not 1.0, I’m very comfortable putting this in production because what I’m comparing it to is much more raw.’”

IBM Fellow and VP of Cloud Jason McGee agrees with this and notes that “our mission since Istio’s launch has been to enable everyone to succeed with microservices, especially in the enterprise. This is why we’ve focused the community around improving security and scale, and heavily leaned our contributions on what we’ve learned from building agile cloud architectures for companies of all sizes.”

A lot of the large cloud players now support Istio directly, too. IBM supports it on top of its Kubernetes Service, for example, and Google even announced a managed Istio service for its Google Cloud users, as well as some additional open-source tooling for serverless applications built on top of Kubernetes and Istio.

Two names missing from today’s party are Microsoft and Amazon. I think that’ll change over time, though, assuming the project keeps its momentum.

Istio also isn’t part of any major open-source foundation yet. The Cloud Native Computing Foundation (CNCF), the home of Kubernetes, is backing linkerd, a project that isn’t all that dissimilar from Istio. Once a 1.0 release of these kinds of projects rolls around, the maintainers often start looking for a foundation that can shepherd the development of the project over time. I’m guessing it’s only a matter of time before we hear more about where Istio will land.

May 15, 2018

Veridium Labs teams with IBM and Stellar on carbon credit blockchain

Veridium Labs has been trying to solve a hard problem: how to trade carbon offset credits in an open market. The trouble is that more complex credits don’t have a simple value like a stock, and there hasn’t been a formula to determine their individual value. That has made accounting for them and selling them on open exchanges difficult or impossible. It’s a problem Veridium believes it can finally solve with tokens and the blockchain.

This week the company announced a partnership with IBM to sell carbon offset tokens on the Stellar blockchain. Each company has a role here with Veridium setting up the structure and determining the value formula. Stellar acts as the digital ledger for the transactions and IBM will handle the nuts and bolts of the trade activity of buying, selling and managing the tokens.

Todd Lemons, CEO and cofounder of Veridium Labs, which is part of a larger environmental company called EnVision Corporation, says that even companies with the best of intentions have struggled with how to account for the complex carbon credits. There are simpler offset credits that are sold on exchanges, but ones that seek to measure the impact of a product through the entire supply chain are much more difficult to value. As one example, how does a company making a candy bar source its cocoa and sugar? It’s not always easy to determine through a web of suppliers and sellers.

Moving forward

To partly solve this problem, another EnVision company, InfiniteEARTH, developed a way to account for them called the Redd+ forest carbon accounting methodology. It is widely accepted, to the point that it has been incorporated into the Paris Climate Agreement, but it doesn’t provide a way to turn the credits into fungible assets, that is, easily tradable ones. The problem is that the value of a given credit shifts according to the overall environmental impact of producing a good and getting it to market, and that value can vary from product to product.

Jared Klee, blockchain manager for token initiatives at IBM, says that buying and accounting for Redd+ credits on the company balance sheet has been a huge challenge for organizations. “It’s a major pain point. Today Redd+ credits are over-the-counter assets and there is no central exchange,” he said. That means they are essentially one-off transactions, and companies are forced to hold these assets on the books with no easy way to account for their actual value. That often results in a big loss, he says, and companies are looking for ways to comply in a more cost-efficient way.

Putting it together

The three companies — Veridium, IBM and Stellar — have come together to solve this problem by creating a digital token that acts as a layer on top of the carbon credit to give it a value and make it easier to account for. In addition, the tokens can be bought and sold on the blockchain.

The blockchain provides all the usual advantages of a decentralized record-keeping system: immutable records and encrypted transactions.

Veridium is working on the underlying formula for token valuation, which measures “carbon density per dollar times product group,” Lemons explained. “That can be coded into a token and carried out automatically,” he added. They are working with various world bodies, like the United Nations and the World Resources Institute, to help figure out the values for each product group.
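Taken literally, and purely as an illustration (the factor values and function names below are invented, not Veridium’s actual model), that formula reduces to a product a token contract could evaluate automatically:

```python
# Illustrative reading of "carbon density per dollar times product group";
# the numbers and product-group factors are invented, not Veridium's model.
PRODUCT_GROUP_FACTORS = {
    "confectionery": 1.4,  # hypothetical weighting for a candy bar's supply chain
    "electronics":   2.1,
}


def offset_tokens_owed(spend_usd: float, carbon_density_per_dollar: float,
                       product_group: str) -> float:
    """Offset tokens owed for a purchase, per the formula described above."""
    return (spend_usd * carbon_density_per_dollar
            * PRODUCT_GROUP_FACTORS[product_group])


print(offset_tokens_owed(100.0, 0.02, "confectionery"))  # 2.8
```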

All of the details are still being worked out as the idea works its way through the various regulatory bodies, but the companies hope to be making the tokens available for sale some time later this year.

Ultimately this is about finding ways to help businesses comply with environmental initiatives and remove some of the complexity inherent in that process today. “We hope the tokens will provide less friction and a much higher adoption rate,” Lemons said.
