Over 75 people attended the Ripple Labs Tech Talk on understanding consensus, part of our ongoing initiative to better educate the broader community about Ripple technology.

Approaching the talk from both a technical and broader industry perspective, Chief Cryptographer David Schwartz discussed the role of innovative banks and the early influence of Bitcoin, and provided a technical history and overview of the core processes underpinning the Ripple protocol.


Guests—who hailed from prominent tech firms, international financial institutions, and various universities—arrived early for a lively happy hour at the Ripple Labs headquarters in downtown San Francisco.


This was the second in the Ripple Labs series of Tech Talks.


In a lively Q&A segment, David answered a wide range of questions, from the technical, such as the robustness of consensus, to the broad, including his take on bank innovation.

Stay tuned for a video recording of the event.



Photo: Christina/Flickr

Since the beginning of recorded history, the process of standardization has set the stage for immense gains in collaboration, productivity, and innovation. Standards allow us to find collective harmony within a society that grows increasingly complex.

Naturally, the first standards were ways of measuring time and space, from the Mayan calendar to the “ell,” the unit of measurement that King Henry I of England instituted in 1120 AD, defined as the length of his arm.

While early standards often existed in part as a vehicle for increasing the prestige and power of the rulers and regulators who created them, they would, as expectations evolved, become a source of individual empowerment. Following the French Revolution, a new system of measurement was promoted as “a way to erase the arbitrary nature of local rule,” writes Andrew Russell, author of Open Standards and the Digital Age: History, Ideology, and Networks. The argument: how could citizens truly be free, independent, and self-reliant if they weren’t able to make calculations and measurements on their own?

Indeed, it was broad standardization that paved the way for the Industrial Revolution. Interchangeable parts dramatically reduced costs, allowing for easy assembly of new goods, cheap repairs, and, most of all, a reduction in the time and skill required of workers. Or consider how those manufactured products are then shipped—likely by train. Prior to the standardization of the railroad gauge, cargo traveling between regions had to be unloaded and moved to new trains because the distance between rails no longer matched the trains’ wheels.

 


Photo: Flickr

On the other end of the spectrum, the failure to enact proper standards isn’t just inefficient and costly; it can prove disastrous, as in 1904, when a vicious fire broke out in Baltimore. New York, Philadelphia, and Washington, DC quickly sent support, but their efforts were in vain: their fire hoses weren’t compatible with the local fire hydrants. The fire burned for over 30 hours and destroyed 2,500 buildings.

While the situation with today’s payment systems isn’t nearly as dangerous, the lack of a universal standard for transacting value imposes persistent costs and serves as a bottleneck to true financial innovation.

In the U.S., the last time there was broad consensus on a new payments standard was the creation of the Automated Clearing House in the 1970s, an electronic system meant to replace paper checks. That system, which still underpins essentially all domestic payments, has remained relatively unchanged for four decades. The primary reason is that achieving consensus on new standards isn’t easy, especially in an industry as far-reaching and as fundamental to the economy as payments, where a wide range of constituents have incentives that don’t always align. So even as the Federal Reserve pushes for real-time payments, effecting actual change remains elusive while the technology grows increasingly antiquated.

While payment standards find themselves stuck in time, standards everywhere else have continued to evolve.

The latter half of the 20th century saw the rise of the concept of the open standard. While there’s no set definition of an open standard, there are a few commonly accepted properties, such as being available to the general public and unencumbered by patents.

Early manifestations of the open standard were physical, the quintessential embodiment being the shipping container. Conceptualized by Malcom McLean in the 1950s and later standardized by the U.S. Maritime Administration and the International Organization for Standardization in the 1960s, the shipping container became a universal standard for moving goods.

As the standard became widely accepted and used, shipping boomed and costs spiraled downward. In other words, the birth of globalization began with a standard. Such is the ubiquity of shipping containers today that they’re used for low-cost housing on the outskirts of Berlin and as a beer garden in trendy parts of San Francisco.

 


Photo: Håkan Dahlström/Flickr

As it turned out, open standards wouldn’t just facilitate transportation of goods, they’d also enable the efficient and cheap sharing of information through the internet.

Before the rise of open standards, it was often physically impossible to connect different computers. Even when you could connect them, each required proprietary information to understand the others. The creation of standards like Ethernet, TCP/IP, and HTML allowed an unprecedented level of interoperability and simplicity when it came to transporting data. “As we know in hindsight, each of these open standards created an explosion of innovation,” tech luminary Joi Ito wrote in 2009.

And Internet standards are still evolving—from Creative Commons for copyrighted material to OAuth for online authorization.

While open standards have liberated the movement of physical goods and digital information, moving dollars and cents has been disappointingly left behind. It’s one of the primary reasons that there are still 2.5 billion people who lack access to the global economy.

In many cases, serving the unserved starts with setting a standard. One place where that idea has taken hold is Peru, which has one of the lowest rates of financial inclusion in all of South America: 8 out of 10 working adults don’t have access to proper financial services.

When the country initially investigated how to give more people access, it assumed the problem was mostly technological. It soon discovered that technology was only a small piece of the puzzle: for financial inclusion efforts to truly move forward, regulators would have to create a clear regulatory framework that standardized new technologies while promoting innovation and competition.

In 2013, the Peruvian government did just that, enacting e-money legislation that would blaze a path for serving those living in poverty. It wasn’t long before major financial institutions were onboard. Today, Peru serves as an international model for taking on inclusion.

The U.S. appears to be following suit. A recent report from the Federal Reserve highlighted four paths to modernizing the U.S. payment system. Tellingly, “option 2” of the report details the development of “protocols and standards for sending and receiving payments.”

That the U.S. central bank has acknowledged the potential for a new payments standard is momentous. Intelligently crafted standards create the potential for a common language, a universal platform where innovation and economics can flourish.


We’re proud to announce the first release of our new Gateway Guide, a comprehensive manual for operating a gateway in the Ripple network. Whether you’re trying to understand how a gateway makes revenue, how to use the authorized accounts feature, or even just what a warm wallet is, the Gateway Guide has you covered.

The guide comes with step-by-step, diagrammed explanations of typical gateway operations, a hefty list of precautions to make your gateway safer, and concrete examples of all the API calls you need to perform in order to get your gateway accounts set up and secure.
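For a flavour of those API calls, here’s a minimal sketch in Python, assuming a rippled server listening on its standard JSON-RPC port; the account address and secret are placeholders, and the Gateway Guide itself is the authoritative reference for the full flow.

```python
import json
from urllib.request import Request, urlopen

# Placeholder URL: a rippled server's JSON-RPC interface (default port 5005).
RIPPLED = "http://localhost:5005/"

def rpc(method: str, **params) -> dict:
    """Call rippled's JSON-RPC interface: {"method": ..., "params": [{...}]}."""
    body = json.dumps({"method": method, "params": [params]}).encode()
    req = Request(RIPPLED, body, {"Content-Type": "application/json"})
    with urlopen(req) as resp:
        return json.load(resp)["result"]

def enable_require_auth(issuing_address: str, secret: str) -> str:
    """Set the RequireAuth flag (asfRequireAuth = 2) on the issuing account,
    so only accounts the gateway explicitly authorizes can hold its issuances."""
    result = rpc("submit", secret=secret, tx_json={
        "TransactionType": "AccountSet",
        "Account": issuing_address,
        "SetFlag": 2,
    })
    return result.get("engine_result", "unknown")

# enable_require_auth("rEXAMPLEISSUINGADDRESS", "sEXAMPLESECRET")  # placeholders
```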

We’re proud of all the work we’ve done to make the business of running a gateway easier, but there’s still more work to do. If you have any questions, comments, or ideas, please send feedback to – or post it on our forums. We’d love to hear from you!


Ripple users can make payments to Bitcoin addresses directly from Ripple Trade.

Here’s how it works:

It’s that simple.

We call it a “Bitcoin Bridge” and it gives Ripple users access to the entire Bitcoin economy. Please note that payments through the Bitcoin Bridge may take a significant amount of time to be processed.

What Is The Bitcoin Bridge?

Ripple users can make a payment in any currency — dollars, euros, etc. — and the Bitcoin merchant will receive the payment in Bitcoins.

That’s good for Ripple users and it’s also good for Bitcoin merchants.

Thanks to the Bitcoin Bridge, all Bitcoin merchants now accept payments from Ripple users. More than 8,500 merchants are now available on the Ripple network:

And thousands more…

How Does It Work?

The Bitcoin Bridge is a simple protocol that connects the Ripple and Bitcoin networks. When you send money from Ripple to a Bitcoin address, an organization running the bridge protocol facilitates the transaction.

The Bitcoin Bridge is operated by SnapSwap. Please reach out to SnapSwap for Bitcoin transaction details.
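As a sketch of the flow, a wallet might resolve a Bitcoin address into Ripple payment details like this. The endpoint URL and response fields below are hypothetical placeholders; the actual quoting API is defined by the bridge operator.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

# Hypothetical bridge-operator endpoint; the real API is operator-defined.
BRIDGE_URL = "https://bridge.example.com/quote"

def quote_bitcoin_payment(btc_address: str, amount_btc: str) -> dict:
    """Ask the bridge how to deliver amount_btc to btc_address.

    The assumed response names a Ripple address and destination tag; the
    wallet then sends an ordinary Ripple payment (in any currency) to that
    address, and the operator forwards bitcoin on the other side.
    """
    query = urlencode({"destination": btc_address, "amount": amount_btc})
    with urlopen(f"{BRIDGE_URL}?{query}") as resp:
        return json.load(resp)

# quote = quote_bitcoin_payment("1ExampleBitcoinAddress", "0.05")
# ...then pay quote["destination"] with quote["destination_tag"] attached.
```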

Currency Inclusive

Ripple was designed to be currency inclusive. Instead of promoting one currency or another, Ripple makes it easy to send payments in any currency and acts as a “universal translator” for money. The Bitcoin Bridge is one more step in Ripple’s mission to connect the disparate Internet payment systems into a single, shared, unified payment network, freely available and accessible to all.

To learn more, read the official press release.

To use Ripple in your business, contact us at .

 


I’m not sure this kind of development methodology has ever been applied to such an extreme before, so I figured I’d document it. In a nutshell, it’s sort of like test-driven triplet-programming development.

While speed-developing our alpha codebase, four of us sat around a table in the office in Berlin. Three of us (Vitalik, Jeff and me) were each coding our own clean-room implementation of the Ethereum protocol. The fourth was Christoph, our master of testing.

Our target was to have three fully compatible implementations as well as an unambiguous specification by the end of three days of substantial development. Over distance, this process normally takes a few weeks.

This time we needed to expedite it; our process was quite simple. First, we discuss the various consensus-breaking changes and formally describe them as best we can. Then we each crack on with coding up the changes simultaneously, popping our heads up about possible clarifications to the specification as needed. Meanwhile, Christoph devises and codes tests, populating the expected results either manually or with the farthest-ahead of the implementations (C++, generally :-P).

After a milestone’s worth of changes are coded up and the tests written, each clean-room implementation is tested against the common test data that Christoph compiled. Where issues are found, we debug in a group. So far, this has proved to be an effective way of producing well-tested code quickly, and perhaps more importantly, in delivering clear unambiguous formal specifications.
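In spirit, the shared-test workflow looks something like the Python sketch below; the fixture names and contents are invented for illustration (the real fixtures pin down VM execution and state transitions, not toy arithmetic).

```python
import json

# Invented fixture data: each case fixes inputs and the expected output.
FIXTURES = json.loads("""
{
  "add_small": {"input": [2, 3], "expected": 5},
  "add_zero":  {"input": [7, 0], "expected": 7}
}
""")

def run_fixtures(impl) -> None:
    """Run one implementation's entry point against every shared case."""
    for name, case in FIXTURES.items():
        got = impl(*case["input"])
        assert got == case["expected"], f"{name}: got {got}, want {case['expected']}"

# Each clean-room implementation (C++, Go, Python, ...) plugs in its own entry
# point; all of them must pass the identical fixture set.
run_fixtures(lambda a, b: a + b)
print("all fixtures passed")
```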

Are there any more examples of such techniques taken to the extreme?

The post The Ethereum Development Process appeared first on .

 


Ripple Labs is thrilled to join the International Payments Framework Association (IPFA), which provides rules sets, best practices, and guidelines to improve cross-border payments.

Composed of over 25 prominent members in the payments space—including the likes of ACH, NACHA, and SWIFT—the IPFA promotes a grand vision for creating a global payments framework that facilitates interoperability and efficient cross-border payment processing.

IPFA is one of a series of membership groups and industry associations that Ripple Labs has joined in order to further our vision of transforming payments. Ripple Labs recently joined the Center for Financial Services Innovation Network and became a member of the NACHA Payment Innovation Alliance in June.

“IPFA rules—when they are appropriately modified for Ripple—help us create a complete, real-time, cross-border payment system,” said Nilesh Dusane, director of business development at Ripple Labs.

“We’re very excited to join this network,” he said.


I’m Vinay Gupta, the newly minted release coordinator for Ethereum. I’ve been working with the comms team on strategy, and have now come aboard to help smooth the release process.

I’ll be about 50/50 on comms and on release coordination. A lot of that is going to be about keeping you updated on progress: new features, new documentation, and hopefully writing about great new services you can use, so it’s in the hinterland between comms and project management. In theory, once I’m up to speed, I should be providing you with the answers to the question: “what’s going on?”

But give me some time, because getting up to speed on all of this is nontrivial. We have a very large development team working with very advanced and often quite complex new technology, and keeping everybody up to date on that simultaneously is going to be tricky. To do that well, I have to actually understand what’s going on at quite a technical level first. I have a lot to wrap my head around. I was a 3D graphics programmer through the 1990s, and have a reasonably strong grounding in financial cryptography (I was, and I am not ashamed to admit it, a cypherpunk in those days).

But we have a 25-30 person team working in parallel on several different aspects of Ethereum, so… patience please while I master the current state of play, so that I can communicate about what’s changing as we move forwards. It’s a lot of context to acquire, as I’m sure you all know – if there’s an occasional gaffe as I get oriented, forgive me!

I’ve just come back from Switzerland, where I got to meet a lot of the team, my “orientation week” being three days during the release planning meetings. Gav writes in some detail about that week here, so rather than repeat Gav, read his post, and I’ll press on to tell you what was on that release white board.

There is good news, there is bad news, but above all, there is a release schedule.

There will be another blog post with much more detail about the release schedule for the first live Ethereum network shortly – likely by the end of this week, as the developer meeting that Gav mentions in his post winds up and the conclusions are communicated. That’s the post which will give you timelines you can start firing up your mining rigs to, feature lists, and so on. Until then, let me lay out roughly what the four major steps in the release process will look like and we can get into detail soon.

Let’s lay out where we are first: Ethereum is a sprawling project with many teams in many countries implementing the same protocol in several different language versions, so that it can be integrated into the widest possible range of other systems and ecologies, and to provide long-term resilience and future-proofing. In addition to that broad effort, there are several specific applications and toolchains to help people view, build and interact with Ethereum: Mist, Mix, Alethzero and so on. Starting quite soon, and over the next few months, a series of these tools will be stood up as late alpha, beta, ready for general use and shipped. Because the network is valuable, and the network is only as secure as the software we provide, this is going to be a security-led, not schedule-led, process. You want it done right, we want it done right, and this is one of the most revolutionary software projects ever shipped.

While you’re waiting for the all-singing, all-dancing CERN httpd + NCSA Mosaic combo, the “we have just launched the Future of the Internet” breakthrough system, we will actually be releasing the code and the tools in layers. We are standing up the infrastructure for a whole new web a piece at a time: server first, plus tool chain, and then the rich client with the full user experience. This makes sense: a client needs something to connect to, so the server infrastructure has to come first. An internet based on this metacomputer model is going to be a very different place, and getting a good interface to it is going to present a whole new set of challenges. There’s no way to simply put all the pieces together and hope it clips into place, like forming an arch by throwing bricks in the air: we need scaffolding, and precise fit. We get that by concentrating on the underlying technical aspects for a while, including mining, the underlying network and so on, and then, as that is widely deployed, stable and trusted, we will move up the stack towards the graphical user interface via Mist in the next few months.

None of these pieces stands alone, either: the network needs miners and exchanges, and it takes people time to get organized to do that work properly. The Mist client needs applications, or it’s a bare browser with nothing to connect to, and it takes people time to write those applications. Each change, each step forwards, involves a lot of conversations and support as we get people set up with the new software and help them get their projects off the ground: the whole thing together is an ecology. Each piece needs its own time, its own attention. We have to do this in phases for all of these reasons, and more.

It took bitcoin, a much less complex project, several years to cover that terrain: we have a larger team, but a more complex project. On the other hand, if you’re following the github repositories, you can see how much progress is being made, week by week, day by day, so… verify for yourself where we are.

So, now that we’re all on the same page about real-world software engineering, let’s look at the phases of this release process!

Release Step One: Frontier

Frontier takes a model familiar to Bitcoiners, and stands it up for our initial release. Frontier is the Ethereum network in its barest form: an interface to mine Ether, and a way to upload and execute contracts. The main use of Frontier on the launch trajectory is to get mining operations and Ether exchanges running, so the community can get their mining rigs started, and to start to establish a “live” environment where people can test DApps and acquire Ether to upload their own software into Ethereum.

This is “no user interface to speak of” command line country, and you will be expected to be quite expert in the whole Ethereum world model, as well as to have substantial mastery of the tools at your disposal.

However, this is not a test net: this is a frontier release. If you are equipped, come along! Do not die of dysentery on the way.

Frontier showcases three areas of real utility:

  • you can mine real Ether, at 10% of the normal Ether issuance rate (0.59 Ether per block reward), which can be spent to run programs or exchanged for other things, as normal – this is real Ether.
  • you can exchange Ether for Bitcoin, or with other users, if you need Ether to run code etc.
  • if you already bought Ether during the crowd sale, and you are fully conversant with the frontier environment, you can use it on the frontier network.
  • we do not recommend this, but have a very substantial security-and-recovery process in place to make it safer – see below 

We will migrate from Frontier to Homestead once Frontier is fully stable in the eyes of the core devs and the auditors:

  • when we are ready to move to Homestead, the release after Frontier, the Frontier network will be shut down; Ether values in wallets will be transferred, but state in contracts will likely be erased (more information to follow on this in later blog posts)
  • switchover to the new network will be enforced by “TheBomb” (see the sketch below)
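We don’t yet have the details of how “TheBomb” will work, but one plausible reading, sketched below with entirely made-up numbers, is an extra difficulty term that grows exponentially past a trigger block, making the old chain impractical to keep mining:

```python
# Illustrative only: all numbers are hypothetical, not protocol values.
BOMB_BLOCK = 200_000        # hypothetical trigger height
DOUBLING_PERIOD = 10_000    # hypothetical doubling rate past the trigger

def difficulty(base: int, block_number: int) -> int:
    """Base difficulty plus a term that doubles every DOUBLING_PERIOD blocks
    after BOMB_BLOCK, eventually dwarfing the base and freezing the chain."""
    if block_number <= BOMB_BLOCK:
        return base
    return base + 2 ** ((block_number - BOMB_BLOCK) // DOUBLING_PERIOD)

for n in (BOMB_BLOCK, BOMB_BLOCK + 200_000, BOMB_BLOCK + 400_000):
    print(n, difficulty(10**12, n))
```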

This is very early release software: feature complete within these boundaries, but with a substantial risk of unexpected behaviours unseen in either the test net or the security review. And it’s not just us that will be putting new code into production: contracts, exchanges, miners, everybody else in the ecosystem will be shipping new services. Any one of those components getting seriously screwed up could impact a lot of users, and we want to shake bugs out of the ecosystem as a whole, not simply our own infrastructure: we are all in this together.

However, to help you safeguard your Ether, we have the following mechanisms planned (more details from the developers will follow soon as the security model is finalised):

  • if you do not perform any transactions, we guarantee 100% that your Ether will not be touched and will be waiting for you once we move beyond Frontier
  • if you perform transactions, we guarantee 100% that any Ether you did not spend will not be touched and will be available to you once we move beyond Frontier
  • Ether you spend will not fall through the cracks into other people’s pockets or vanish without a trace: in the unlikely event that this happens, you have 24 hours to inform us, and we will freeze the network, return to the last good state, and start again with the bug patched
  • yes, this implies a real risk of network instability: everything possible has been done to prevent this, but this is a brand new aeroplane – take your parachute!
  • we will periodically checkpoint the network to show that neither user report nor automated testing has reported any problems. We expect the checkpoints will be around once daily, with a mean of around 12 hours of latency
  • exchanges etc. will be strongly encouraged to wait for checkpoints to be validated before sending out payments in fiat or bitcoin. Ethereum will provide explicit support to aid exchanges in determining what Ether transactions have fully cleared

Over the course of the next few weeks, several pieces of software have to be integrated to maintain this basket of security features so we can allow genesis block Ether onto this platform without unacceptable risks. Building that infrastructure is a new process, and while it looks like a safe, sane and conservative schedule, there is always a chance of a delay as an unknown unknown is discovered, whether by us, the bug bounty hunters or the security auditors. There will be a post shortly which goes through this release plan in real technical detail, and I’ll have a lot of direct input from the devs on that post, so for now take this with a pinch of salt; we will have hard details and expected dates as soon as possible.

Release Step Two: Homestead

Homestead is where we move after Frontier. We expect the following three major changes.

  • Ether mining will be at 100% rather than 10% of the usual reward rate
  • checkpointing and manual network halts should never be necessary, although it is likely that checkpointing will continue if there is a general demand for it
  • we will remove the severe risk warning from putting your Ether on the network, although we will not consider the software to be out of beta until Metropolis

Still command line, and much the same feature set as Frontier, but this is the release we tell you is ready to go, within the relevant parameters.

How long will there be between Frontier and Homestead? Depends entirely on how Frontier performs: best case is not less than a month. We will have a pretty good idea of whether things are going smoothly or not from network review, so we will keep you in the loop through this process.

Release Step Three: Metropolis

Metropolis is when we finally officially release a relatively full-featured user interface for non-technical users of Ethereum, and throw the doors open: Mist launches, and we expect this launch to include a DApp store and several anchor tenant projects with full-featured, well-designed programs to showcase the full power of the network. This is what we are all waiting for, and working towards.

In practice, I suspect there will be at least one, and probably two as-yet-unnamed steps between Homestead and Metropolis: I’m open to suggestions for names (write to vinay[at]ethdev.com). Features will be sensible checkpoints on the way: specific feature sets inside of Mist would be my guess, but I’m still getting my head around that, so I expect we will cross those bridges after Homestead is stood up.

Release Step Four: Serenity

There’s just one thing left to discuss: mining. Proof of Work implies the inefficient conversion of electricity into heat, Ether and network stability, and we would quite like to not warm the atmosphere with our software more than is absolutely necessary. Short of buying carbon offsets for every unit of Ether mined (is that such a bad idea?), we need an algorithmic fix: the infamous Proof of Stake. 

Switching the network from Proof of Work to Proof of Stake is going to be a substantial transition, potentially much like the one between Frontier and Homestead. Similar rollback measures may be required, although in all probability more sophisticated mechanisms will be deployed (e.g. running both mechanisms together, with Proof of Work dominant, and flagging any cases where Proof of Stake gives a different output).
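A minimal sketch of that “run both, flag divergence” idea: Proof of Work stays authoritative while a hypothetical Proof of Stake validator runs in shadow mode, and disagreements are only logged, never enforced. All names here are illustrative.

```python
from dataclasses import dataclass

@dataclass
class Block:
    number: int
    pow_valid: bool   # verdict of the authoritative Proof of Work check
    pos_valid: bool   # verdict of the shadow Proof of Stake check

def accept(block: Block, log: list) -> bool:
    """Proof of Work remains the consensus rule during the transition;
    Proof of Stake disagreements are recorded for analysis only."""
    accepted = block.pow_valid
    if block.pos_valid != accepted:
        log.append(f"divergence at block {block.number}: "
                   f"PoW={accepted} PoS={block.pos_valid}")
    return accepted

log = []
accept(Block(1, pow_valid=True, pos_valid=True), log)
accept(Block(2, pow_valid=True, pos_valid=False), log)
print(log)  # ['divergence at block 2: PoW=True PoS=False']
```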

This seems a long way out, but it’s not as far away as all that: the work is ongoing.

Proof of Work is a brutal waste of computing power – like democracy*, the worst system except all the others (*voluntarism etc. have yet to be tried at scale). Freed from that constraint, the network should be faster, more efficient, easier for newcomers to get into, and more resistant to cartelization of mining capacity etc. This is probably going to be almost as big a step forwards as putting smart contracts into a block chain in the first place, by the time all is said and done. It is a ways out. It will be worth it. 

Timelines

As you have seen since the Ether Sale, progress has been rapid and stable. Code on the critical path is getting written, teams are effective and efficient, and over-all the organization is getting things done. Reinventing the digital age is not easy, but somebody has to do it. Right now that is us.

We anticipate roughly one major announcement a month for the next few months, and then a delay while Metropolis is prepared. There will also be DEVcon One, an opportunity to learn the practical business of building and shipping DApps, meet fellow developers and potential investors, and understand the likely shape of things to come.

We will give you information about each release in more detail as each release approaches, but I want to give you the big overview of how this works and where we are going, fill in some of the gaps, highlight what is changing, both technically and in our communications and business partnership, and present you with an overview of what the summer is going to be like as we move down the path towards Serenity, another world changing technology.

I’m very glad to be part of this process. I’m a little at sea right now trying to wrap my head around the sheer scope of the project, and I’m hoping to actually visit a lot of the development teams over the summer to get the stories and put faces to names. This is a big, diverse project and, beyond the project itself, the launch of a new sociotechnical ecosystem. We are, after all, a platform effort: what’s really going to turn this into magic is you, and the things you build on top of the tools we’re all working so hard to ship. We are making tools for tool-makers.

Vinay signing off for now. More news soon!

 

The post The Ethereum Launch Process appeared first on .

 

I was woken by Vitalik’s call at 5:55 this morning; pitch black outside, nighttime was still upon us. Nonetheless, it was time to leave and this week had best start on the right foot.

The 25-minute walk in darkness from the Zug-based headquarters to the train station was wet. Streetlights reflecting off the puddles on the clean Swiss streets provided a picturesque, if quiet, march into town. I couldn’t help but think the rain running down my face was a very liquid reminder of the impending seasonal change, and then, on consideration, how fast the last nine months had gone.

Solid Foundations

The last week was spent in Zug by the Ethereum Foundation board and ÐΞV leadership: Vitalik, Mihai and Taylor (who officially form the foundation’s board), Anthony and Joseph as the other official advisors, and Aeron and Jutta as the ÐΞV executive, joined by Jeff and myself wearing the multiple hats of ÐΞV and advisory. The chief outcome was the dissemination of Vitalik’s superb plan to reform the foundation and turn it into a professional entity. The board will be recruited from accomplished professionals with minimal conflicts of interest; the present set of “founders” will officially retire from those positions and a professional executive will be recruited, the latter process led by Joseph. Anthony will take a greater ambassadorial role for Ethereum in China and North America. Conversely, ÐΞV will function much more as a department of the foundation’s executive rather than as a largely independent entity. Finally, I presented the release strategy to the others; an event after which I’ve never seen quite so many photos taken of a whiteboard. Needless to say, all was well received by the board and advisors. More information will be coming soon.

As I write this, I’m sitting on a crowded early commuter train, Vinay Gupta in tow, who took on a much more substantive role this week as release coordinator. He’ll be helping with release strategy and keeping you informed of our release process. This week, which might rather dramatically be described as ‘pivotal’ in the release process, will see Jeff, Vitalik and me sit around a table and develop all the PoC-9 changes, related unit tests, and integrations in three days, joined by our indomitable Master of Testing, Christoph. The outcome of this week will inform our announcement, coming later this week, outlining in clear terms what we will be releasing and when.

I’m sorry it has been so long without an update. The last two months have been somewhat busy, choked up with travel and meetings, with the remaining time soaked up by coding, team-leading and management. The team is now substantially formed; the formal security audit started four weeks ago; the bounty programme is running smoothly. The latter processes are in the exceedingly capable hands of Jutta and Gustav. Aeron, meanwhile, will be stepping down as the ÐΞV head of finance and operations and assuming the role he was initially brought aboard for: system modelling. We’ll hopefully be able to announce his successors next week (yes, that was plural; he has been doing the jobs of 2.5 people over the last few months).

We are also in the process of forming partnerships with third parties in the industry, with George, Jutta and myself managing this process. I’m happy to announce that at least three exchanges will be supporting Ether from day one on their trading platforms (details of which we’ll announce soon), with more exchanges to follow. Marek and Alex are providing technical support there, with Marek going so far as to make a substantial reference exchange implementation.

I also finished the first draft of ICAP, the Ethereum Inter-exchange Client Address Protocol, an IBAN-compatible system for referencing and transacting to client accounts, aimed at streamlining the process of transferring funds worry-free between exchanges and, ultimately, making KYC and AML pains a thing of the past. The IBAN compatibility may even open the possibility of easy integration with existing banking infrastructure at some point in the future.
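To make the idea concrete, here’s a back-of-the-envelope Python sketch of the “direct” ICAP encoding as I understand the draft: country code XE, two standard IBAN check digits, then the address in base 36. Treat it as illustrative; the draft specification is the authority.

```python
DIGITS = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def to_icap(address_hex: str) -> str:
    """Encode a 160-bit address as an IBAN-style ICAP string (sketch)."""
    num = int(address_hex.replace("0x", "", 1), 16)
    # Base-36 encode the address.
    bban = ""
    while num:
        num, rem = divmod(num, 36)
        bban = DIGITS[rem] + bban
    # Pad to 30 characters; addresses >= 36**30 need the 31-character form.
    bban = bban.rjust(30, "0")
    # Standard IBAN check digits (ISO 13616): move the country code and "00"
    # to the end, map A=10..Z=35, then take 98 minus the remainder mod 97.
    as_number = int("".join(str(int(c, 36)) for c in bban + "XE00"))
    return f"XE{98 - as_number % 97:02d}{bban}"

print(to_icap("0x00c5496aee77c1ba1f0854206a26dda82a81d6d8"))
```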

Developments

Proof-of-Concept releases VII and VIII were released. NatSpec, the “natural language specification format” that forms the basis of our transaction security, was prototyped and integrated. Under Marek’s watch, now helped by Fabian, ethereum.js is truly coming of age, with near source-level compatibility with Solidity on contract interaction and support for the typed ABI with calling and events, the latter providing hassle-free state-change reporting. Mix, our IDE, underwent its first release and, after some teething issues, is getting good use thanks to the excellent work done by Arkadiy and Yann. Solidity had numerous features added and is swiftly approaching 1.0 status, with Christian, Lefteris and Liana to thank. Marian’s work goes ever forward on the network monitoring system, while Sven and Heiko have been working diligently on the stress-testing infrastructure which analyses and tests peer network formation and performance. They’ll soon be joined by Alex and Lefteris to accelerate this programme.

So one of the major things that needed sorting for the next release was the proof-of-work algorithm we’ll use. This had a number of requirements, two of which were actually pulling in opposite directions: basically, it had to be a light-client-friendly algorithm whose speed of mining is proportional to IO bandwidth and which requires a considerable amount of RAM to do so. There was a vague consensus that we (well… Vitalik and Matthew) head in the direction of a Hashimoto-like algorithm (a proof-of-work designed for the Bitcoin blockchain that aims to be IO-bound, meaning, roughly, that to make it go any faster you’d need to add more memory rather than just sponsoring a smaller/faster ASIC). Since our blockchain has a number of important differences from the Bitcoin blockchain (mainly in transaction density), stemming from the extremely short 12s block time we’re aiming for, we would have to use not the blockchain data itself, as Hashimoto does, but rather an artificially created dataset, generated with an algorithm known as Dagger (yes, some will remember it as Vitalik’s first and flawed attempt at a memory-hard proof-of-work).

While this looked like a good direction to be going in, a swift audit of Vitalik and Matt’s initial algorithm by Tim Hughes (ex-Director of Technology at Frontier Developments and an expert in low-level CPU and GPU operation and optimisation) showed major flaws. With his help, they devised a substantially more watertight algorithm that, we are confident to say, should make the job of developing an FPGA/ASIC sufficiently difficult, especially given our determination to switch to a proof-of-stake system within the next 6-12 months.
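For intuition, here’s a toy, Hashimoto-flavoured mining loop in Python. This is not the real algorithm (the actual dataset, sizes and hash were still being finalised); it only illustrates the IO-bound idea that each round’s dataset read depends on the previous one, so throughput is bounded by memory bandwidth rather than raw hashing speed.

```python
import hashlib

# Toy parameters: the real design uses a far larger, Dagger-generated dataset.
DATASET_WORDS = 1 << 16   # stand-in for a multi-gigabyte dataset
MIX_ROUNDS = 64           # dataset reads per nonce attempt

def make_dataset(seed: bytes) -> list:
    """Pseudo-random dataset standing in for Dagger's output."""
    out, h = [], seed
    for _ in range(DATASET_WORDS):
        h = hashlib.sha3_256(h).digest()
        out.append(h)
    return out

def hashimoto_like(header: bytes, nonce: int, dataset: list) -> bytes:
    """Each round's index depends on the previous mix, forcing serial,
    unpredictable memory reads."""
    mix = hashlib.sha3_256(header + nonce.to_bytes(8, "big")).digest()
    for _ in range(MIX_ROUNDS):
        idx = int.from_bytes(mix[:4], "big") % DATASET_WORDS
        mix = hashlib.sha3_256(mix + dataset[idx]).digest()
    return mix

def mine(header: bytes, target: int, dataset: list) -> int:
    """Search nonces until the mix hash falls below the target."""
    nonce = 0
    while int.from_bytes(hashimoto_like(header, nonce, dataset), "big") >= target:
        nonce += 1
    return nonce

dataset = make_dataset(b"epoch-seed")
print("found nonce:", mine(b"block-header", 2 ** 248, dataset))
```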

Last, but not least, the new website was launched. Kudos to Ian and Konstantin for knuckling down and getting it done. Next stop will be the developer site, which will be loosely based on the excellent resource at qt.io, the aim being to provide a one-stop extravaganza of up-to-date reference documentation, curated tutorials, examples, recipes, downloads, issue tracking, and build status.

Onwards

So, as Alex, our networking maestro, might say, these are exciting times. When deep in the nitty-gritty of development you sometimes forget quite how world-altering the technology you’re creating is, which is probably just as well, since the gravity of the matter at hand would be continually distracting. Nonetheless, when one starts considering the near-term changes we can really bring about, one realises that the wave of change is at once unavoidable and heading straight for you. For what it’s worth, I find an excellent accompaniment to this crazy life is the superb music of Pretty Lights.

The post Gav’s Ethereum ÐΞV Update V appeared first on .