Timesdelhi.com

March 25, 2019
Category archive: linux

Suse is once again an independent company


Open-source infrastructure and application delivery vendor Suse — the company behind one of the oldest Linux distributions — today announced that it is once again an independent company, having finalized its $2.5 billion acquisition by growth investor EQT from Micro Focus, which itself had acquired it back in 2014.

Few companies have changed hands as often as Suse and yet remained strong players in their business. Suse was first acquired by Novell in 2004. Novell was then acquired by Attachmate in 2010, which Micro Focus acquired in 2014. The company then turned Suse into an independent division, only to then announce its sale to EQT in the middle of 2018.

It took a while for Micro Focus and EQT to finalize the acquisition, but now, for the first time since 2004, Suse stands on its own.

Micro Focus says that when it acquired Attachmate Group for $2.35 billion, Suse generated just 20 percent of the group’s total revenues. Since then, Suse has generated quite a bit more business as it expanded its product portfolio well beyond its core Linux offerings and into the more lucrative open-source infrastructure and application delivery business by, among other things, offering products and support around massive open-source projects like Cloud Foundry, OpenStack and Kubernetes.

Suse CEO Nils Brauckmann will remain at the helm of the company, but the company is shaking up its executive ranks a bit. Enrica Angelone, for example, has been named to the new post of CFO at Suse, and Sander Huyts is now the company’s COO. Former Suse CTO Thomas Di Giacomo is now president of Engineering, Product and Innovation. All three report directly to Brauckmann.


“Our genuinely open, open source solutions, flexible business practices, lack of enforced vendor lock-in and exceptional service are more critical to customer and partner organizations, and our independence coincides with our single-minded focus on delivering what is best for them,” said Brauckmann in today’s announcement. “Our ability to consistently meet these market demands creates a cycle of success, momentum and growth that allows SUSE to continue to deliver the innovation customers need to achieve their digital transformation goals and realize the hybrid and multi-cloud workload management they require to power their own continuous innovation, competitiveness and growth.”

Since IBM recently bought Red Hat for $34 billion, though, it remains to be seen how long Suse’s independent future will last. The market for open source is only heating up, after all.

News Source = techcrunch.com

The next integration evolution — blockchain

Here is one way to look at distributed ledger technologies (DLT) and blockchain in the context of integration evolution. Over the years, businesses and their systems have been getting more integrated, forming industry-specific trustless networks, and blockchain technology is the foundation of this evolutionary step.

Enterprise integration

Large organizations have a large number of applications running in separate silos that need to share data and functionality in order to operate in a unified and consistent way. The process of linking such applications within a single organization, to enable sharing of data and business processes, is called enterprise application integration (EAI).

Similarly, organizations also need to share data and functionality in a controlled way among themselves. They need to integrate and automate the key business processes that extend outside the walls of the organization. The latter is an extension of EAI, achieved by exchanging structured messages using agreed-upon message standards, and is referred to as business-to-business (B2B) integration.

Fundamentally, both terms refer to the process of integrating data and functionality that span multiple systems and sometimes multiple parties. The systems and business processes in these organizations are evolving, and so is the technology enabling B2B unification.

Evolution of integration

There isn’t a single year when a given integration technology became mainstream; each evolved gradually and built on top of its predecessors. Rather than focusing on a specific technology and year, let’s observe the progression that happened over the decades and see why blockchain is the next technology iteration.

Evolution of integration technologies

Next, we will briefly explore the main technological advances in each evolutionary step.

Data integration

This is one of the oldest mechanisms for information access across different systems with the following two primary examples:

  • The common database approach is used for system integration within organizations.
  • The file sharing method is used for data exchange within and across organizations. With universal protocols such as FTP, file sharing allows the exchange of data between applications running on different machines and operating systems (a minimal sketch follows below).

But both approaches are non-real-time, batch-based integrations with limitations around scalability and reliability.
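
To make the file-sharing approach concrete, here is a minimal sketch of a batch-style FTP exchange in Python; the host, credentials, directories and file names are purely illustrative.

```python
# A minimal sketch of batch-style B2B file sharing over FTP, using Python's
# standard library. Host, credentials and file names are illustrative only.
from ftplib import FTP

with FTP("ftp.partner.example.com") as ftp:  # hypothetical partner host
    ftp.login(user="acme", passwd="secret")

    # Upload today's export for the partner to pick up later (batch, not real time).
    with open("orders_2019-03-25.csv", "rb") as outgoing:
        ftp.storbinary("STOR incoming/orders_2019-03-25.csv", outgoing)

    # Download whatever the partner has dropped off for us.
    with open("invoices.csv", "wb") as incoming:
        ftp.retrbinary("RETR outgoing/invoices.csv", incoming.write)
```

Each run moves whole files on a schedule, which is exactly the batch, non-real-time limitation mentioned above.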

Functionality integration

While data integration provided non-real-time data exchange, the methods described here allow real-time exchange of data and, importantly, of functionality:

  • Remote procedure call provides significant improvements over low-level socket-based integration by hiding networking and data marshaling complexity. But it is an early-generation, language-dependent, point-to-point, client-server architecture (a minimal sketch follows this list).
  • Object request broker architecture (with CORBA, DCOM, RMI implementations) introduces the broker component, which allows multiple applications in different languages to reuse the same infrastructure and talk to each other in a peer-to-peer fashion. In addition, the CORBA model has the notion of naming, security, concurrency, transactionality, registry and language-independent interface definition.
  • Messaging introduces temporal decoupling between applications and ensures guaranteed asynchronous message delivery.
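
To illustrate the remote procedure call style from the first bullet, here is a rough sketch using Python’s built-in XML-RPC modules rather than any specific middleware of that era; the function name and port are made up for the example.

```python
# A rough sketch of the RPC style: the caller invokes what looks like a local
# function, while networking and data marshaling are hidden by the RPC layer.
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def get_order_status(order_id):
    # Hypothetical business function exposed to remote callers.
    return {"order_id": order_id, "status": "SHIPPED"}

server = SimpleXMLRPCServer(("localhost", 8000), logRequests=False, allow_none=True)
server.register_function(get_order_status)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client: a point-to-point, client-server call to a single known endpoint.
client = ServerProxy("http://localhost:8000", allow_none=True)
print(client.get_order_status(42))  # {'order_id': 42, 'status': 'SHIPPED'}
```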

So far we have seen many technology improvements, but they are primarily focused on system integration rather than application integration. From batch to real-time data exchange, from point-to-point to peer-to-peer, from synchronous to asynchronous, these methods neither control nor validate the type of data they exchange. Still, this early-generation integration infrastructure enabled B2B integration, for example by exchanging EDI-formatted data, but without any understanding of the data or the business process it is part of.

With CORBA, we see early attempts at interface definitions and services that are useful for application integration.

Service-oriented architecture

The main aspects of SOA relevant for our purpose are the Web Services standards: XML provides a language-independent format for data exchange, SOAP provides a common message format and WSDL provides an independent format for describing service interfaces. Together they form the foundation of web services. These standards, combined with ESB and BPM implementations, shifted the focus of integration toward business integration semantics, whereas the prior technologies primarily enabled system integration.

Web services allowed systems not to exchange data blindly, but to have machine readable contracts and interface definitions. Such contracts would allow a system to understand and validate the data (up to a degree) before interacting with the other system.
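
As a toy illustration of that idea (hand-rolled, not actual WSDL/XSD tooling; the field names are hypothetical), a consumer can check a request against a published contract before ever calling the remote service:

```python
# A hand-rolled "service contract": the client validates a request against it
# before invoking the remote operation, mimicking what WSDL/XSD tooling does
# automatically for web services. Field names are hypothetical.
CREATE_ORDER_CONTRACT = {
    "customer_id": int,
    "sku": str,
    "quantity": int,
}

def validate(payload, contract):
    """Return a list of contract violations found in the payload."""
    errors = []
    for field, expected_type in contract.items():
        if field not in payload:
            errors.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected_type):
            errors.append(f"{field}: expected {expected_type.__name__}")
    return errors

request = {"customer_id": 1001, "sku": "ABC-1", "quantity": "two"}
print(validate(request, CREATE_ORDER_CONTRACT))
# ['quantity: expected int'] -- caught before the remote call is ever made
```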

I also include microservices architectural style here, as in its core, it builds and improves over SOA and ESBs. The primary evolution during this phase is around distributed system decomposition and transition from WS to REST-based interaction.

In summary, this is the phase where, on top of common protocols, distributed systems also got common standards and contracts definitions.

Blockchain-based integration

While exchanging data over common protocols and standards helps, the service contracts do not provide insight about the business processes hidden behind the contracts and running on remote systems. A request might be valid according to the contract, but invalid depending on the business processes’ current state. That is even more problematic when integration is not between two parties, as in the client-server model, but among multiple equally involved parties in a peer-to-peer model.

Sometimes multiple parties are part of the same business process, which is owned by no single party but by all of them. A prerequisite for the proper functioning of such a multi-party interaction is transparency of the common business process and its current state. All of that makes blockchain technology very attractive for implementing distributed business processes among multiple parties.

This model extends the use of shared protocols and service contracts with shared business processes and their state. With blockchain, all participating entities share the same business process in the form of smart contracts. But in order to validate requests, process them and come to the same conclusion, the business processes also need the same state, and that is achieved through the distributed ledger. Sharing all the past states of a smart contract is not a goal in itself, but a prerequisite of the shared business process runtime.
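
The sketch below is a deliberately simplified model (in Python, not a real smart-contract platform) of why sharing both the process definition and its state matters: every party replays the same ledger of past events through the same transition rules, and therefore reaches the same verdict about any new request.

```python
# Simplified model of a shared business process: the transition rules play the
# role of the smart contract, the event list plays the role of the ledger.
VALID_TRANSITIONS = {
    ("ORDERED", "ship"): "SHIPPED",
    ("SHIPPED", "deliver"): "DELIVERED",
}

def replay_state(ledger):
    """Derive the current process state from the shared event history."""
    state = "ORDERED"
    for event in ledger:
        state = VALID_TRANSITIONS[(state, event)]
    return state

def validate_request(ledger, event):
    """Every participant runs this same check against the same ledger."""
    return (replay_state(ledger), event) in VALID_TRANSITIONS

shared_ledger = ["ship"]  # replicated to all parties by the blockchain
print(validate_request(shared_ledger, "deliver"))  # True for every party
print(validate_request(shared_ledger, "ship"))     # False: invalid in the current state
```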

Looked at from this angle, blockchain can be viewed as the next step in the integration evolution. As we will see below, blockchain networks act as a kind of distributed ESB and BPM machinery that is not contained within a single business entity but spans multiple organizations.

Integration technology moving into the space between systems

First the protocols (such as FTP), then the API contracts (WSDL, SOAP) and now the business processes themselves (smart contracts) and their data are moving outside of the organizations, into the common shared space, and becoming part of the integration infrastructure. In some respects, this trend is analogous to how the cross-cutting responsibilities of microservices are moving from within the services into the supporting platforms.

With blockchain, common data models and now business processes are moving out of the organizations into the shared business networks. Something to note is that this move is not universally applicable and it is not likely to become a mainstream integration mechanism. Such a move is only possible when all participants in the network have the same understanding of data models and business processes; hence, it is applicable only in certain industries where the processes can be standardized, such as finance, supply chain, health care, etc.

Generations of integrations

Having followed the chronological progression of the technology, let’s take a broader look at the B2B integration evolution and its main stages.

First generation: system integration protocols

This is the generation of integration technology before CORBA and SOA, enabling mainly data exchange over common protocols but without an understanding of the data, contracts and business processes:

  • Integration model: client-server, where the server component is controlled by one party only; examples are databases, file servers, message brokers, etc.
  • Explicit, shared infrastructure: low-level system protocols and APIs such as FTP.
  • Implicit, not shared infrastructure: application contracts, data formats, business processes not part of the common integration infrastructure.

Second generation: application integration contracts

This generation of integration technology uses the system protocols from previous years and allows applications to share their APIs in the form of universal contracts. This is the next level of integration, where both applications understand the data, its structure and possible error conditions, but not the business process and current state behind it in the other system:

  • Integration model: client-server model with APIs described by contracts.
  • Explicit, shared infrastructure: protocols, application contracts, and API definitions.
  • Implicit, not shared infrastructure: business processes and remote state are still private.

Third generation: distributed business processes

The blockchain-based generation, which still has to prove itself as a viable enterprise architecture, goes a step further. It uses peer-to-peer protocols and shares business processes with state across multiple systems that are controlled by parties that do not trust each other. While previous integration generations required a shared understanding of protocols or APIs, this one relies on a common understanding of the full business process and its current state. Only then does it make sense and pay off to form a cross-organization distributed business process network:

  • Integration model: multi-party, peer-to-peer integration, by forming business networks with distributed business processes.
  • Explicit, shared infrastructure: business process and its required state.
  • Implicit, not shared infrastructure: other non-process related state.

There are many blockchain-based projects that are taking different approaches to solving the business integration challenges. In no particular order, here are some of the most popular and interesting permissioned open-source blockchain projects targeting the B2B integration space:

  • Hyperledger Fabric is one of the most popular and advanced blockchain frameworks, initially developed by IBM, and now part of the Linux Foundation.
  • Hyperledger Sawtooth is another Linux Foundation distributed ledger project, developed initially by Intel. It is popular for its modularity and full component replaceability.
  • Quorum is an enterprise-focused distribution of Ethereum.
  • Corda is another project that builds on top of existing JVM-based middleware technologies and enables organizations to transact with contracts and exchange value.

There are already many business networks built with the above projects, enabling network member organizations to integrate and interact with each other using this new integration model.

In addition to these full-stack blockchain projects that provide network nodes, there also are hybrid approaches. For example, Unibright is a project that aims to connect internal business processes defined in familiar standards such as BPMN with existing blockchain networks by automatically generating smart contracts. The smart contracts can be generated for public or private blockchains, which can act as another integration pillar among organizations.

Recently, there have been many blockchain experiments across many fields. While public blockchains generate all the hype by promising to change the world, private and permissioned blockchains are promising less, but are advancing steadily.

Conclusion

Enterprise integration has multiple nuances. Integration challenges within an organization, where all systems are controlled by one entity and participants have some degree of trust in each other, are mostly addressed by modern ESBs, BPMs and microservices architectures. But when it comes to multi-party B2B integration, there are additional challenges. These systems are controlled by multiple organizations, have no visibility into each other’s business processes and do not trust each other. In these scenarios, we see organizations experimenting with a new breed of blockchain-based technology that relies not only on sharing of protocols and contracts but on sharing of the end-to-end business processes and state.

And this trend is aligned with the general direction integration has been evolving over the years: from sharing the very minimum protocols, to sharing and exposing more and more in the form of contracts, APIs and now business processes.

This shared integration infrastructure enables new transparent integration models where the previously private business processes are now jointly owned, agreed, built, maintained and standardized using the open-source collaboration model. This can motivate organizations to share business processes and form networks to benefit further from joint innovation, standardization and deeper integration in general.

News Source = techcrunch.com

How open source software took over the world


It was just 5 years ago that there was an ample dose of skepticism from investors about the viability of open source as a business model. The common thesis was that Redhat was a snowflake and that no other open source company would be significant in the software universe.

Fast forward to today and we’ve witnessed the growing excitement in the space: Redhat is being acquired by IBM for $32 billion (roughly 3x its market cap from 2014); Mulesoft was acquired after going public for $6.5 billion; MongoDB is now worth north of $4 billion; Elastic’s IPO now values the company at $6 billion; and, through the merger of Cloudera and Hortonworks, a new company with a market cap north of $4 billion will emerge. In addition, there’s a growing cohort of impressive OSS companies working their way through the growth stages of their evolution: Confluent, HashiCorp, DataBricks, Kong, Cockroach Labs and many others. Given the relative multiples that Wall Street and private investors are assigning to these open source companies, it seems pretty clear that something special is happening.

So, why did this movement that once represented the bleeding edge of software become the hot place to be? There are a number of fundamental changes that have advanced open source businesses and their prospects in the market.


From Open Source to Open Core to SaaS

The original open source projects were not really businesses, they were revolutions against the unfair profits that closed-source software companies were reaping. Microsoft, Oracle, SAP and others were extracting monopoly-like “rents” for software, which the top developers of the time didn’t believe was world class. So, beginning with the most broadly used components of software – operating systems and databases – progressive developers collaborated, often asynchronously, to author great pieces of software. Everyone could not only see the software in the open, but through a loosely-knit governance model, they added, improved and enhanced it.

The software was originally created by and for developers, which meant that at first it wasn’t the most user-friendly. But it was performant, robust and flexible. These merits gradually percolated across the software world and, over a decade, Linux became the second most popular OS for servers (next to Windows); MySQL mirrored that feat by eating away at Oracle’s dominance.

The first entrepreneurial ventures attempted to capitalize on this adoption by offering “enterprise-grade” support subscriptions for these software distributions. Redhat emerged as the winner in the Linux race, and MySQL (the company) did the same for databases. These businesses had some obvious limitations – it was harder to monetize software with just support services, but the market size for OS’s and databases was so large that, in spite of more challenged business models, sizeable companies could be built.

The successful adoption of Linux and MySQL laid the foundation for the second generation of Open Source companies – the poster children of this generation were Cloudera and Hortonworks. These open source projects and businesses were fundamentally different from the first generation on two dimensions. First, the software was principally developed within an existing company and not by a broad, unaffiliated community (in the case of Hadoop, the software took shape within Yahoo!). Second, these businesses were based on the model that only parts of the software in the project were licensed for free, so they could charge customers for use of some of the software under a commercial license. The commercial aspects were specifically built for enterprise production use and thus easier to monetize. These companies, therefore, had the ability to capture more revenue even if the market for their product didn’t have quite as much appeal as operating systems and databases.

However, there were downsides to this second generation model of open source business. The first was that no company singularly held ‘moral authority’ over the software – and therefore the contenders competed for profits by offering increasing parts of their software for free. Second, these companies often balkanized the evolution of the software in an attempt to differentiate themselves. To make matters more difficult, these businesses were not built with a cloud service in mind. Therefore, cloud providers were able to use the open source software to create SaaS businesses on top of the same software base. Amazon’s EMR is a great example of this.

The latest evolution came when entrepreneurial developers grasped the business model challenges existent in the first two generations – Gen 1 and Gen 2 – of open source companies, and evolved the projects with two important elements. The first is that the open source software is now developed largely within the confines of businesses. Often, more than 90% of the lines of code in these projects are written by the employees of the company that commercialized the software. Second, these businesses offer their own software as a cloud service from very early on. In a sense, these are Open Core / Cloud service hybrid businesses with multiple pathways to monetize their product. By offering the products as SaaS, these businesses can interweave open source software with commercial software so customers no longer have to worry about which license they should be taking. Companies like Elastic, Mongo, and Confluent with services like Elastic Cloud, Confluent Cloud, and MongoDB Atlas are examples of this Gen 3.  The implications of this evolution are that open source software companies now have the opportunity to become the dominant business model for software infrastructure.

The Role of the Community

While the products of these Gen 3 companies are definitely more tightly controlled by the host companies, the open source community still plays a pivotal role in the creation and development of the open source projects. For one, the community still discovers the most innovative and relevant projects. They star the projects on Github, download the software in order to try it, and evangelize what they perceive to be the better project so that others can benefit from great software. Much like how a good blog post or a tweet spreads virally, great open source software leverages network effects. It is the community that is the source of promotion for that virality.

The community also ends up effectively being the “product manager” for these projects. It asks for enhancements and improvements; it points out the shortcomings of the software. The feature requests are not in a product requirements document, but on Github, comments threads and Hacker News. And, if an open source project diligently responds to the community, it will shape itself to the features and capabilities that developers want.

The community also acts as the QA department for open source software. It will identify bugs and shortcomings in the software; test 0.x versions diligently; and give the companies feedback on what is working or what is not.  The community will also reward great software with positive feedback, which will encourage broader use.

What has changed though, is that the community is not as involved as it used to be in the actual coding of the software projects. While that is a drawback relative to Gen 1 and Gen 2 companies, it is also one of the inevitable realities of the evolving business model.

Linus Torvalds was the designer of the open-source operating system Linux.

Rise of the Developer

It is also important to realize the increasing importance of the developer for these open source projects. The traditional go-to-market model of closed source software targeted IT as the purchasing center of software. While IT still plays a role, the real customers of open source are the developers who often discover the software, and then download and integrate it into the prototype versions of the projects that they are working on. Once “infected” by open source software, these projects work their way through the development cycles of organizations from design, to prototyping, to development, to integration and testing, to staging, and finally to production. By the time the open source software gets to production it is rarely, if ever, displaced. Fundamentally, the software is never “sold”; it is adopted by the developers who appreciate the software more because they can see it and use it themselves rather than being subject to it based on executive decisions.

In other words, open source software permeates itself through the true experts, and makes the selection process much more grassroots than it has ever been historically. The developers basically vote with their feet. This is in stark contrast to how software has traditionally been sold.

Virtues of the Open Source Business Model

The resulting business model of an open source company looks quite different than a traditional software business. First of all, the revenue line is different. Side-by-side, a closed source software company will generally be able to charge more per unit than an open source company. Even today, customers do have some level of resistance to paying a high price per unit for software that is theoretically “free.” But, even though open source software is lower cost per unit, it makes up for it in total market size by leveraging the elasticity of the market. When something is cheaper, more people buy it. That’s why open source companies have such massive and rapid adoption when they achieve product-market fit.

Another great advantage of open source companies is their far more efficient and viral go-to-market motion. The first and most obvious benefit is that a user is already a “customer” before she even pays for it. Because so much of the initial adoption of open source software comes from developers organically downloading and using the software, the companies themselves can often bypass both the marketing pitch and the proof-of-concept stage of the sales cycle. The sales pitch is more along the lines of, “you already use 500 instances of our software in your environment, wouldn’t you like to upgrade to the enterprise edition and get these additional features?”  This translates to much shorter sales cycles, the need for far fewer sales engineers per account executive, and much quicker payback periods of the cost of selling. In fact, in an ideal situation, open source companies can operate with favorable Account Executives to Systems Engineer ratios and can go from sales qualified lead (SQL) to closed sales within one quarter.

This virality allows open source software businesses to be far more efficient than traditional software businesses from a cash-consumption standpoint. Some of the best open source companies have been able to grow their business at triple-digit growth rates well into their life while maintaining moderate cash burn rates. This is hard to imagine in a traditional software company. Needless to say, less cash consumption equals less dilution for the founders.


Open Source to Freemium

One last aspect of the changing open source business that is worth elaborating on is the gradual movement from true open source to community-assisted freemium. As mentioned above, the early open source projects leveraged the community as key contributors to the software base. In addition, even for slight elements of commercially-licensed software, there was significant pushback from the community. These days the community and the customer base are much more knowledgeable about the open source business model, and there is an appreciation for the fact that open source companies deserve to have a “paywall” so that they can continue to build and innovate.

In fact, from a customer perspective, the two value propositions of open source software are that you can a) read the code and b) treat it as freemium. The notion of freemium is that you can basically use it for free until it’s deployed in production or at some degree of scale. Companies like Elastic and Cockroach Labs have gone as far as actually open sourcing all their software but applying a commercial license to parts of the software base. The rationale is that real enterprise customers would pay whether the software is open or closed, and they are more incentivized to use commercial software if they can actually read the code. Indeed, there is a risk that someone could read the code, modify it slightly, and fork the distribution. But in developed economies – where much of the rents exist anyway – it’s unlikely that enterprise companies will elect the copycat as a supplier.

A key enabler of this movement has been the more modern software licenses that companies have either originally embraced or migrated to over time. Mongo’s new license, as well as those of Elastic and Cockroach, are good examples of these. Unlike the Apache-incubated license – which was often the starting point for open source projects a decade ago – these licenses are far more business-friendly, and most modern open source businesses are adopting them.

The Future

When we originally penned this article on open source four years ago, we aspirationally hoped that we would see the birth of iconic open source companies. At a time when there was only one model – Redhat – we believed that there would be many more. Today, we see a healthy cohort of open source businesses, which is quite exciting. I believe we are just scratching the surface of the kind of iconic companies that we will see emerge from the open source gene pool. From one perspective, these companies valued in the billions are a testament to the power of the model. What is clear is that open source is no longer a fringe approach to software. When top companies around the world are polled, few of them intend to have their core software systems be anything but open source. And if the Fortune 5000 migrate their spend on closed source software to open source, we will see the emergence of a whole new landscape of software companies, with the leaders of this new cohort valued in the tens of billions of dollars.

Clearly, that day is not tomorrow. These open source companies will need to grow and mature and develop their products and organization in the coming decade. But the trend is undeniable and here at Index we’re honored to have been here for the early days of this journey.

News Source = techcrunch.com

AWS launches Arm-based servers for EC2


At its re:Invent conference in Las Vegas, AWS today announced the launch of Arm-based servers for its EC2 cloud computing service. These aren’t run-of-the-mill Arm chips, though. AWS took the standard Arm cores and then customized them to fit its needs. The company says that its so-called AWS Graviton Processors have been optimized for performance and cost, with a focus on scale-out workloads that can be spread across a number of smaller instances (think containerized microservices, web servers, caching fleets, etc.).

The first set of instances, called A1, is now available in a number of AWS regions in the U.S. and Europe. They support all of AWS’s standard instance pricing models, including on-demand, reserved instance, spot instance, dedicated instance and dedicated host.

For now, you can only use Amazon Linux 2, RHEL and Ubuntu as operating systems for these machines, but AWS promises that additional operating system support will launch in the future.

Because these are ARM servers, you’ll obviously have to recompile any native code for them before you can run your applications on them. Virtually any application that is written in a scripting language, though, will probably run without any modifications.

Prices for these instances start at $0.0255/hour for an a1.medium machine with 1 CPU and 2 GiB of RAM and go up to $0.4080/hour for machines with 16 CPUs and 32 GiB of RAM. That’s maybe not as cheap as you would’ve expected, given that an x86-based t3.nano server starts at $0.0052/hour, but you can always save quite a bit by using spot instances, of course. Until we see some benchmarks, though, it’s hard to compare these different machine types anyway.
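
For a rough sense of scale, here is a back-of-the-envelope conversion of the hourly on-demand prices quoted above into approximate monthly figures; the size labels beyond a1.medium and t3.nano are our own annotations, and actual pricing varies by region and changes over time.

```python
# Back-of-the-envelope monthly costs from the hourly on-demand prices quoted
# above. Labels other than a1.medium/t3.nano are annotations, and real
# pricing varies by region and over time.
HOURS_PER_MONTH = 730  # common approximation (24 * 365 / 12)

hourly_prices = {
    "a1.medium (1 vCPU, 2 GiB)": 0.0255,
    "largest A1 size (16 vCPUs, 32 GiB)": 0.4080,
    "t3.nano (x86)": 0.0052,
}

for instance, price in hourly_prices.items():
    print(f"{instance}: ~${price * HOURS_PER_MONTH:,.2f} per month")
```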

As Amazon’s Jeff Barr notes in today’s announcement, the company’s move to its so-called Nitro System now allows it to launch new instance types at a faster clip. Nitro essentially provides the building blocks for creating new instance types that the team can then mix and match as needed.

It’s worth noting that AWS also launched support for AMD EPYC processors earlier this month.


News Source = techcrunch.com

WTF is happening to crypto?


Four days ago the crypto markets were crashing hard. Now they’re crashing harder. Bitcoin, which hasn’t fallen past $6,000 for months, has dumped to $4,413.99 as of this morning, and nearly everything else is falling in unison. Ethereum, flying high at $700 a few months ago, is at $140. Coinbase, that bastion of crypto stability, is currently sporting a series of charts that look like Aspen black-diamond ski runs.

What is happening? There are a number of theories, and I’ll lay out a few of them here. Ultimately, sentiment is bleak in the crypto world, with bull runs being seen as a thing of a distant past. As regulators clamp down, pie-in-the-sky ideas crash and shady dealers take their shady dealings elsewhere, the things that made cryptocurrencies so much fun — and so dangerous — are slowly draining away. What’s left is anyone’s guess, but at least it will make things less interesting.

The bag holder theory

November was supposed to be a good month for crypto. Garbage sites like FortuneJack were crowing about bitcoin stability while the old crypto hands were optimistic and pessimistic at the same time. Erik Voorhees, founder of ShapeShift, felt that the inevitable collapse of the global financial system would be good for folks with at least a few BTC in their wallets.

Others, like Binance CEO Changpeng Zhao, are expecting a bull run next year; Zhao also said his company has been particularly profitable.

Ultimately, crypto hype moves the market far more than it has any right to, and this is a huge problem.

So who do you believe, these guys or your own lying eyes? That’s a complex question. First, understand that crypto is a technical product weaponized by cash. Companies like Binance and Coinbase will work mightily to maintain revenue streams, especially considering Coinbase’s current level of outside investment. These are startups that can literally affect their own value over time. We’ll talk about that shortly. Ultimately, crypto hype hasn’t been matching reality of late, a major concern to the skittish investor.

“I think that the downturn is due to things not going up as much as people had wanted. Everyone was expecting November to be a bull month,” said Travin Keith, founder of Altrean. “When things indicated that it wasn’t going that way, those who were on borrowed time, such as those needing some buffer, or those in the crypto business needing some money, needed to sell.”

Tether untethered

Tether has long been the prime suspect in the Bitcoin run up and crash. Created by an exchange called Bitfinex, the currency is pegged to the dollar and, according to the exchange itself, each tether — about $2.7 billion worth — is connected to an actual dollar in someone’s bank account. Whether or not this is true has yet to be proven, and the smart money is on “not true.” I’ll let Jon Evans explain:

What are those whiffs of misconduct to which I previously referred? I mean. How much time do you have? One passionate critic, known as Bitfinexed, has been writing about this for quite some time now; it’s a pretty deep rabbit hole. University of Texas researchers have accused Bitfinex/Tether of manipulating the price of Bitcoin (upwards.) The two entities have allegedly been subpoenaed by US regulators. In possibly (but also possibly not — again, a fog of mystery) related news, the US Justice Department has opened a criminal investigation into cryptocurrency price manipulation, which critics say is ongoing. Comparisons are also being drawn with Liberty Reserve, the digital currency service shut down for money laundering five years ago:

So what the hell is going on? Good question. On the one hand, people and even companies are innocent until proven guilty, and the opacity of cryptocurrency companies is at least morally consistent with the industry as a whole. A wildly disproportionate number of crypto people are privacy maximalists and/or really hate and fear governments. (I wish the US government didn’t keep making their “all governments become jackbooted surveillance police states!” attitude seem less unhinged and more plausible.)

But on the other … yes, one reason for privacy maximalism is because you fear rubber-hose decryption of your keys, but another, especially when anti-government sentiment is involved, is because you fear the taxman, or the regulator. A third might be that you fear what the invisible hand would do to cryptocurrency prices, if it had full leeway. And it sure doesn’t look good when at least one of your claims, e.g. that your unaudited reserves are “subject to frequent professional audits,” is awfully hard to interpret as anything other than a baldfaced lie.

Now Bloomberg is reporting that the U.S. Justice Department is looking into Bitfinex for manipulating the price of Bitcoin. The belief is that Bitfinex has allegedly been performing wash trades that propped up the price of Bitcoin all the way to its previous $20,000 heights. “[Researchers] claimed that Tether was used to buy Bitcoin at pivotal periods, and that about half of Bitcoin’s 1,400 percent gain last year was attributable to such transactions,” wrote Bloomberg. “Griffin briefed the CFTC on his findings earlier this year, according to two people with direct knowledge of the matter.”

This alone could point to the primary reason Bitcoin and crypto are currently in free fall: without artificial controls, the real price of the commodity becomes clear. A Twitter user called Bitfinex’d has been calling for the death of Tether for years. He’s not very bullish on the currency in 2019.

“I don’t know the when,” Bitfinex’d said. “But I know Tether dies along with Bitfinex.”

Le shitcoin est mort

As we learned last week, the SEC is sick of fake utility tokens. While the going was great for ICOs over the past few years with multiple companies raising millions if not billions in a few minutes, these salad days are probably over. Arguably, a seed-stage startup with millions of dollars in cash is more like a small VC than a product company, but ultimately the good times couldn’t last.

What the SEC ruling means is that folks with a lot of crypto can’t slide it into “investments” anymore. However, this also means that those same companies can be more serious about products and production rather than simply fundraising.

SEC intervention dampens hype, and in a market that thrives on hype, this is a bad thing. That said, it does mean that things will become a lot clearer for smaller players in the space, folks who haven’t been able to raise seed rounds and are instead praying that token sales are the way forward. In truth they are: buttoning up token sales for future users and creating regulation around them will begin to prevent the Wild West activity we’ve seen so far. Ultimately, it’s a messy process, but a necessary one.

“It all contributes to greater BTC antifragility, doesn’t it?” said crypto speculator Carl Bullen. “We need the worst actors imaginable. And we got ’em.”

Bitmain

One other interesting data point involves Bitmain. Bitmain makes cryptocurrency mining gear and most recently planned a massive IPO that was supposed to be the biggest in history. Instead, the company put these plans on hold.

Interestingly, Bitmain currently folds the cryptocurrency it mines back into the company, creating a false scarcity. The plan, however, was for Bitmain to begin releasing the Bitcoin it mined into the general population, thereby changing the price drastically. According to an investor I spoke with this summer, the Bitmain IPO would have been a massive driver of Bitcoin success. Now it is on ice.

While this tale was apocryphal, it’s clear that these chicken and egg problems are only going to get worse. As successful startups face down a bear market, they’re less likely to take risks. And, as we all know, crypto is all about risk.

Abandon all hope? Ehhhhh….

Ultimately, crypto and the attendant technologies have created an industry. That this industry is connected directly to stores of value, either real or imagined, has energized it to a degree unprecedented in tech. After all, to use a common comparison between Linux and blockchain, Linus Torvalds didn’t make millions of dollars overnight for writing a device driver in 1993. He — and the entire open-source industry — made billions of dollars over the past 27 years. The same should be true of crypto, but the cash is clouding the issue.

Ultimately, say many thinkers in the space, the question isn’t whether the price goes up or down. Instead, of primary concern is whether the technology is progressing.

“Crypto capitulation is once again upon us, but before the markets can rise again we must pass through the darkest depths of despair,” said crypto guru Jameson Lopp. “Investors will continue to speculate while developers continue to build.”

News Source = techcrunch.com
