How a NASA Open Source Startup Could Change the IT Universe
Tang and Velcro aren’t the only things that NASA helped to invent that are part of our modern world. NASA has also played a pivotal role in the emergence of cloud technology that could reshape the vast IT world here on Earth.
Chris Kemp, the first CTO of IT at NASA, helped to lead an effort at the U.S. space agency to create a cloud compute platform. The original 9,000 lines of code, known as Nova, have become the cornerstone of the OpenStack cloud project. OpenStack now has the backing of major IT vendors, including IBM, Dell, HP, Cisco, AT&T and Intel. Kemp left NASA in 2011 and is currently the CEO of Nebula, a startup that is set to deliver commercially supported OpenStack gear.
In an exclusive interview with InternetNews, Kemp explained how the open source cloud effort came to be and how NASA is already reaping the rewards.

Inspired by Google
When Kemp started at NASA back in 2006, he helped to lead the partnership with Google. It’s an experience that shaped his thinking about the availability of compute resources.
“It was through working with Google that I came to realize how much of a force multiplier it was to give every one of your employees access to infinite compute and storage resources,” Kemp said. “That’s really what the people that work at Google feel that they’ve got.”
In contrast, NASA engineers at the time were somewhat budget-constrained in terms of compute resources. So Kemp wanted to see what would happen if those engineers were given access to a large pool of compute and storage infrastructure.
“Incredible things happened,” Kemp said.
Kemp and his team at NASA then replicated the system so it was in use by hundreds of groups. It even caught the eye of the White House.

Startup CEO
“I’ve always been in a founder, CEO role,” Kemp said. “When I was at NASA I always viewed my role there as an entrepreneur and I viewed my team as a startup team.”
Kemp motivated his team of NASA engineers the same way that he would a team at a startup company. He tried to find people from across the organization and motivate them by giving them something important to do. He worked to make sure they were aware that their contributions mattered.
In taking the NASA cloud computing project forward into the open source world as part of OpenStack, Kemp realized that even more could be done. NASA partnered with Rackspace in July 2010 to form the OpenStack project.
“With NASA we had the opportunity to inspire a community,” Kemp said. “A lot of OpenStack’s success has to do with people that deep down inside were inspired by the space program and wanted to contribute code to a project that they knew would help NASA to explore the solar system.”

NASA’s Most Successful Project
Today OpenStack is the backbone of cloud offerings from Cisco, Dell, HP and many others in the emerging cloud market. The project has become so successful, in fact, that NASA no longer needs to be actively involved.
NASA continued to make code contributions as part of the OpenStack effort, until fairly recently.
“NASA spends billions of dollars trying to commercialize technology; it tries to take all of the money we invest in the space program and translate it into value for the average American,” Kemp said. “OpenStack could possibly be one of the most successful spinoffs in the history of NASA.”
In Kemp’s view, OpenStack is destined to reshape the $3 trillion IT market, enabling a whole new era of companies to innovate and empower people. Its success has already benefited NASA: instead of having to fund development, the agency can now simply buy OpenStack services.
“The Nova/Nebula team had gotten this thing so far, so quickly that it (NASA) can now buy it from Dell or HP or a whole host of companies,” Kemp said. “NASA didn’t need to be in this role of incubator anymore, it could do what it normally does, which is to procure the technology from the commercial ecosystem.”
In Kemp’s view, NASA did the right thing by stepping back from the project and letting the private sector step in with OpenStack.
Sean Michael Kerner is a senior editor at InternetNews.com, the news service of the IT Business Edge Network, the network for technology professionals. Follow him on Twitter @TechJournalist.
ObjectsSearch Offers Open Source Search Results
Aiming to provide users with unbiased website ranking, ObjectsSearch.com has launched an open source search engine based on Nutch.org’s search. ObjectsSearch looks to solve problems related to search result manipulation and information overload. ObjectsSearch claims that its open source approach provides an “alternative to commercial web search engines. Only open source search results can be fully trusted to be without bias.” This premise runs counter to the methods by which the major search engines rank their results.
“All existing major search engines have proprietary ranking formulas, and will not explain why a given page ranks as it does. Additionally, some search engines determine which sites to index based on payments, rather than on the merits of the sites themselves. ObjectsSearch, on the other hand, has nothing to hide and no motive to bias its results or its crawler in any way other than to try to give each user the best results possible.”
Each result that appears on ObjectsSearch’s results page contains four links: a cached link, which displays the page as ObjectsSearch downloaded it; an explanation link, which describes how the site received its ranking; an anchor link, which shows the list of incoming anchors indexed for the page in question; and a plain text link, which displays the plain text version of the downloaded page.
While Vivisimo’s Clusty is grabbing headlines for its ability to cluster search results, ObjectsSearch uses an interesting clustering method of its own. In fact, the company claims to be the first search engine “which cluster[s] its own search results unlike other meta search engines, which get their search results from other search engines.” To accomplish this, ObjectsSearch uses what’s known as a “Clustering Engine.”
A description of ObjectsSearch’s approach to search results and clustering appears on its about page: “One approach is to automatically group search results into thematic categories, called clusters. Assuming cluster descriptions are informative about the documents they contain, the user spends much less time following irrelevant links.”
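To make the clustering idea concrete, here is a minimal sketch of the general technique: group results under the most frequent content term in each snippet. This is purely illustrative, not ObjectsSearch’s actual Clustering Engine; the stopword list and sample results are invented for the example.

```python
from collections import Counter, defaultdict

# A tiny stopword list, invented for this example.
STOPWORDS = {"the", "a", "an", "of", "to", "and", "in", "for", "is", "are"}

def dominant_term(text):
    """Return the most frequent non-stopword in a document, or None."""
    words = [w for w in text.lower().split() if w not in STOPWORDS]
    return Counter(words).most_common(1)[0][0] if words else None

def cluster_results(results):
    """Group (title, snippet) search results into clusters keyed by theme."""
    clusters = defaultdict(list)
    for title, snippet in results:
        clusters[dominant_term(snippet)].append(title)
    return dict(clusters)

results = [
    ("Jaguar the cat", "jaguar jaguar big cat"),
    ("Jaguar the car", "jaguar jaguar luxury car"),
    ("Big cats", "cat cat lion tiger"),
]
print(cluster_results(results))
# {'jaguar': ['Jaguar the cat', 'Jaguar the car'], 'cat': ['Big cats']}
```

A real clustering engine would weigh terms by document frequency and build multi-word cluster labels, but the payoff is the same: results about different senses of “jaguar” land in different groups.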
It’s always interesting to see what the little guys like Gigablast, IceRocket and ObjectsSearch are trying out in the search field. They may not have the funds to field a city of international developers working on new features day in and day out, but they do have gumption and the flexibility to test out new things on the fly.
Dell Wednesday expressed its faith in open-source computing like never before and expanded its pacts with vendors Oracle and EMC. Oracle Chairman and CEO Larry Ellison joined Dell CEO Michael Dell onstage at the Pierre Hotel to discuss how the two companies had inked a global sales agreement to implant Oracle databases and application servers on Dell servers in Europe and Asia, and to create cost-effective servers powered by Oracle software for small and medium-sized businesses (SMBs).
Specifically, Dell will craft server and storage platforms primed for Oracle 9i Database with Real Application Clusters for both Red Hat Linux Advanced Server and Microsoft Windows environments. This cluster computing, which allows multiple computers and systems on a network to work as one, is aimed at prying customers from using, say, IBM mainframes that do not scale the way the cluster system employed by Dell using Oracle software would. The firms said the systems will start at $18,000.
While that was the meat of the news, Dell spent the bulk of his time on stage discussing total cost of ownership and return on investment: how strongly the enterprise sector is endorsing open source software to power its information technology infrastructure, and by extension how much money it is saving.
Dell used a presentation peppered with a number of studies from such research firms as IDC, Gartner and Meta Group detailing estimates, figures and predictions as to how much more the enterprise customer will benefit from going to open source — especially Dell servers powered by Oracle software — as opposed to proprietary Unix-based systems. He cited statistic after statistic about how, driven by customers’ desire to have powerful systems at as low a cost as possible, standards-based products were winning the day over proprietary forebears from the likes of IBM, HP and Sun.
“[Dell] is the only large computer systems maker employing standards-based systems,” Dell said. When it came time to discuss cluster computing, Dell called Ellison to the stage.
Ellison peppered his address with anecdotes, some of them humorous, but his message was clear: Oracle wants to help Dell drive out proprietary, Unix-based systems made and sold by rivals such as IBM, HP and Sun in favor of Linux-based cluster computing.
Calling the database the “choke point” in the enterprise data center, Ellison said the point is to get information to put into the database to help people do their jobs. Over time, he said, this role hasn’t changed, but the depth and breadth to which people need to access information has increased. Because of this, the IT world requires software and systems that provide more performance and more reliability. The catch — something Ellison said is his company’s great challenge — is that they want to spend less. Then he peppered his cluster computing speech with stabs at IBM’s database, mainframes and “on demand-computing” strategy.
“When IBM wants to show off their fastest new Unix computer,” Ellison said, “Oracle is the database they use because the benchmarks all show we’re the fastest.”
He then went on to explain how IBM’s DB2 database is not fast enough, is too expensive, and how IBM mainframes and Unix boxes have a single point of failure. Moreover, if an enterprise needs more power, it needs “to throw the old machine out and get a bigger one.”
With Oracle’s Real Application Clusters technology running on Dell servers, Ellison said, the grid design means a single system can fail without the network experiencing any downtime, because another node steps up to take its place. This, he said, saves companies time and money. He cited his own company, saying Oracle employs 24 servers for its operations worldwide. If one goes down, the other 23 take up its load.
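Ellison’s anecdote describes the basic failover property of a cluster: requests are simply routed around a dead node. A toy sketch of that routing logic (the general idea only, not Oracle RAC’s actual implementation; node names are invented):

```python
class Cluster:
    """Toy model of a cluster with no single point of failure: any healthy
    node can serve a request, so one failure causes no downtime."""

    def __init__(self, size):
        # Map node name -> healthy flag.
        self.nodes = {f"node-{i}": True for i in range(size)}

    def fail(self, name):
        self.nodes[name] = False

    def route(self, request_id):
        """Send a request to any healthy node, skipping dead ones."""
        healthy = [n for n, up in self.nodes.items() if up]
        if not healthy:
            raise RuntimeError("total outage")
        return healthy[request_id % len(healthy)]

cluster = Cluster(24)           # Ellison's 24-server example
cluster.fail("node-7")          # one node goes down...
targets = {cluster.route(i) for i in range(200)}
assert "node-7" not in targets  # ...and traffic flows around it
assert len(targets) == 23       # the other 23 pick up the load
```

A real cluster adds shared storage and failure detection on top of this, but the routing principle — clients never depend on any single machine — is the same.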
Ellison did not go unchallenged in the question and answer period. Illuminata analyst Gordon Haff asked: given that Oracle is stressing that proprietary systems will buckle under increasing pressure from the open source community, was Ellison not concerned that the same might happen in the database realm?
Ellison said no, because his company’s databases are some of the most secure software applications ever composed, and unlike an operating system, the data within a database cannot be recovered if it is lost. He claimed Oracle hasn’t had one of its databases cracked in a decade.
“The database is the last piece of software that faces a threat from open source,” he said.
Dell and Oracle have been partners since 1998 and Dell most recently served as Oracle’s launch partner for its “Unbreakable” Linux strategy. The pair will also team on a migration program to lure Unix customers to Oracle9i Database with Real Application Clusters on Dell systems.
Dell said the new CX200 systems, built on the modular architecture of Dell EMC storage arrays, are perfect for lightweight applications.
Next year, if all goes according to plan, Red Hat will become the first open source software company to generate more than US$1 billion a year in revenue. It will be a watershed moment for the open source community, which has long seen its approach of community-based development as a viable, even superior, alternative to traditional notions of how software should be written.
Certainly, open source has left the proprietary software world in turmoil over the past few years, as Linux, the Apache Web server, Perl, Hadoop, OpenOffice, GIMP and dozens of other programs put the pinch on their commercial counterparts. But what are tomorrow’s open source heavy hitters? Here are five projects to watch closely in 2012. They may form the basis for new businesses and new industries. Or they may just capture the minds of developers and administrators with some easier, or at least less expensive, way of getting the job done.
For the better part of the last decade, the choice for Web server software has been pretty stable. Apache has been used on the majority of Web servers while Microsoft’s IIS (Internet Information Services) is used across many of the rest. Over the past few years, however, use of a third entrant, Nginx (pronounced “engine-x”), has been on the rise, thanks to the software’s ability to easily handle high-volume traffic.
Nginx is already run on 50 million different Internet domains, or about 10 percent of the entire Internet, the developers of the software estimate. It is particularly widely used on highly trafficked Web sites, such as Facebook, Zappos, Groupon, Hulu, Dropbox, and WordPress. Not surprisingly, the software’s creator, Igor Sysoev, designed Nginx in 2004 specifically to handle large numbers of concurrent users — up to 10,000 connections per server. “It is a very lean architecture,” said Andrew Alexeev, a co-founder of a company that offers a commercial version of the software, called Nginx.
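The “lean architecture” Alexeev describes is event-driven: rather than dedicating a thread or process to every connection, a single-threaded event loop multiplexes thousands of them. A minimal sketch of the same pattern using Python’s asyncio (illustrative only; Nginx itself is written in C around mechanisms like epoll and kqueue):

```python
import asyncio

async def handle(reader, writer):
    # One lightweight coroutine per connection; a single-threaded event
    # loop multiplexes all of them, instead of one thread per connection.
    await reader.readline()                          # read the request line
    writer.write(b"HTTP/1.0 200 OK\r\n\r\nhello\n")  # minimal response
    await writer.drain()
    writer.close()

async def main():
    server = await asyncio.start_server(handle, "127.0.0.1", 8080)
    async with server:
        await server.serve_forever()

# asyncio.run(main())  # uncomment to serve until interrupted
```

Because a waiting connection costs only a small coroutine object rather than a thread stack, this style scales to the 10,000-connections-per-server figure cited above far more cheaply than a thread-per-connection design.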
The upcoming year promises to be a good one for Nginx. Last year, Nginx got $3 million in backing from a number of venture capital firms, including one supported by Dell CEO Michael Dell. It partnered with Jet-Stream to provide Nginx for that software vendor’s CDN (content delivery network) package. It is also working with Amazon to streamline Nginx for the AWS (Amazon Web Services) cloud.
“We’re not talking about [using OpenStack to run a] cloud of 100 servers or even 1,000 servers, but tens of thousands of servers. Other options out there aren’t really considering that scale,” said Jonathan Bryce, chairman of the OpenStack Project Policy Board.
The core computational components of OpenStack were developed at NASA Ames Research Center, for an internal cloud to store large amounts of space imagery. Originally, the NASA administrators tried using the Eucalyptus software project platform, but found challenges in scaling the software to the required levels, according to Chris Kemp, who oversaw the development of the OpenStack cloud controller when he was CIO of NASA Ames.
To aid in wider adoption, OpenStack is being outfitted with a number of new features that should make it more palatable for enterprises, said John Engates, chief technology officer for managed hosting provider Rackspace. One project, called Keystone, will allow organizations to integrate OpenStack with their identity management systems, such as those based on Microsoft Active Directory or other LDAP (Lightweight Directory Access Protocol) implementations. Developers are working on a front-end portal for the software as well. Rackspace, which first partnered with NASA to package OpenStack for general usage, is also spinning the project off as a fully independent, stand-alone entity, in hopes that it will be an attractive option for more cloud providers.
“2011 was the year for building the base of the product, but I think 2012 is where we really start to use that base for a lot of private and public clouds,” Engates said.
The past year has seen dramatic growth in the use of nonrelational databases, such as Cassandra, MongoDB, CouchDB and countless others. But at the NoSQL Now conference, held last September, much of the buzz surrounded a still unreleased data store called Stig. With any luck, we will see Stig in 2012.
Stig is still a bit of a mystery, as it hasn’t actually been released yet. But observers are predicting it could fit a niche in social networks and other applications that keep a wide range of data. The needs of social networking services are inherently different from those of other types of jobs and would benefit from a database attuned to them, Lucas explained. “You can’t be a relevant service in this space without being able to scale to a planetary size,” he said.
“What I did see looked very interesting,” said Dan McCreary, a semantic solutions architect for the Kelly-McCreary & Associates consulting firm. He praised the database’s functional language architecture, which should ease the deployment of the database across multiple servers.
Linux Mint is designed specifically for people who just want a desktop OS, and who don’t wish to learn more about how Linux works (i.e. non-Linux hobbyists). This approach makes installing and running the software easy and maintenance pretty much a nonissue. Even more than Ubuntu, Mint emphasizes easy usability, at the expense of not using new features until they have proven themselves trustworthy.
For instance, Mint eschews the somewhat controversial Unity desktop interface, which Canonical adopted to more easily port Ubuntu to mobile platforms. Instead, Mint sticks with the more widely known, and more mature, Gnome interface.
Such rigorous adherence to usability may be helping Linux Mint, much to the detriment of Ubuntu, in fact. The Linux Mint project claims its OS is now the fourth most widely used desktop OS in the world, after Windows, Apple Mac and Ubuntu. Over the past year, Mint has even usurped Ubuntu as the distribution that generates the most page views on the DistroWatch Linux news site, a metric generally thought to reflect the popularity of Linux distributions. No doubt 2012 will see only more growth for the OS.
Could Red Hat revolutionize the world of storage software in much the same way it revolutionized the market for Unix-based OSes? In October, Red Hat purchased Gluster, which, with its GlusterFS file system, makes open source software that clusters commodity SATA (Serial Advanced Technology Attachment) drives and NAS (network attached storage) systems into massively scalable pools of storage. Red Hat plans to apply the method it used to dominate the market for Linux OSes for the storage space as well.
According to Red Hat CEO Jim Whitehurst, the storage software market generates $4 billion in revenue annually, though that’s not why the company was interested in the technology. Instead, Red Hat was interested in finding a storage technology that would make cloud migrations easier. “We look for places where open source would be particularly powerful as a way to innovate, and we look for areas in the stack where we think we can monetize,” he said. “There are not other solutions like that out there.”
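What makes GlusterFS-style pooling “massively scalable” is that a file’s location is computed by hashing its path, so any client can find data without consulting a central metadata server. A greatly simplified illustration (GlusterFS’s real elastic hashing algorithm is more involved, and the brick names here are invented):

```python
import hashlib

# Hypothetical brick names; a real volume lists actual servers and paths.
BRICKS = ["server1:/brick", "server2:/brick", "server3:/brick", "server4:/brick"]

def brick_for(path):
    """Compute a file's brick from its path alone, so any client can
    locate the data without asking a central metadata server."""
    digest = hashlib.md5(path.encode()).hexdigest()
    return BRICKS[int(digest, 16) % len(BRICKS)]

# The mapping is deterministic: every client agrees on the location.
assert brick_for("/videos/launch.mp4") == brick_for("/videos/launch.mp4")
```

Because there is no metadata server to bottleneck or fail, adding bricks grows both capacity and throughput, which is what lets commodity drives be pooled into very large volumes.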
Maybe I’m missing something, but I don’t see the point of proprietary network services (or cloud computing, or Software as a Service, if you prefer). Not when you have Free software as an alternative (“Free,” in this case, being analogous to open source or GNU/Linux).
In fact, proprietary network services strike me chiefly as a way to offer the incidental features of Free software without the provider giving up control. But in the last year, I’m glad to say, this dodge has started to become less tenable, as Free software has started to focus on network services.
Oh, I understand why developers might have enjoyed the idea, back a couple of years ago when network services were new. I may not be a developer myself, but I understand how the challenge of the delivery model might add interest to your work, at least before it became commonplace.
Nor would I suggest that network services should never have been developed. Diversity never hurts, and if a technique is plausible, someone is going to develop it. I accept that.
How about centralized bug-fixes and updates? Through the repositories and package management systems of Free operating systems, Free software offers those, too.
The same is true with privacy concerns. Although encryption is starting to be offered by some network services, not only can you do far more to secure your data on a local network or workstation, but, with Free software, you can scrutinize the code and satisfy yourself that no back doors exist for intruders. You don’t have to trust the provider, because you can take steps for yourself.
Just as important, many network services have fewer features than their local counterparts, particularly those for office productivity. No doubt part of the reason is that local applications are more mature, but another seems to be that Web apps jettison features in the interests of faster transmission. As a result, network services can be especially frustrating if you’re a power user, since many of the features you rely on for speed and efficiency simply aren’t available in them. Frankly, given a choice between ajaxWrite and OpenOffice.org, or ajaxSketch and the GIMP, who in their right minds would choose the Web app?
Surely nobody with serious work to do. Such choices would be like insisting on working in a text editor or a paint program when more mature applications are available.
The only reason I can see for clients preferring proprietary Web apps (aside from the fact that they’re trendy) is that software as a service is less of a stretch for the average managerial mind than Free software. Even today, many find the idea of Free software a challenge to standard business practices, because it requires rethinking software procurement, supplier relationships, and, at times, existing business models. By choosing network services, a convention-bound company can often get the use of cost-free software (just as they could with Free software), but without having to worry about any of the mind-stretching aspects that go along with it.
Besides, outsourcing services is something that modern businesses do all the time.
But the ones who really benefit from network services are the suppliers. Unlike traditional software providers, their support costs are lower because most of the maintenance is centralized. Even more significantly, they can protect their so-called intellectual property without adopting a Free license. Furthermore, they can do so while offering — at least to casual or light users — what many outsiders consider the dominant feature of Free software: Availability at no charge.
The OpenStack Folsom release is now available, providing users of the open source cloud stack platform with new compute, storage and networking innovations.
Folsom is the second OpenStack release in 2012, following the Essex release, which debuted in February. Folsom is also the first release of OpenStack made under the auspices of the newly minted OpenStack Foundation.
The OpenStack Foundation now has the support of some of the largest IT companies in the world, including IBM, Dell, HP, Cisco and AT&T, as well as the three largest Linux vendors: Ubuntu, SUSE and Red Hat.
“Folsom is a huge release,” Brian Stevens, CTO of Red Hat, told Datamation. “Every release is always better in the land of open source in terms of features and stability.”
OpenStack is not just one large homogeneous project, but rather a grouping of multiple sub-projects. At the core is the Nova compute project, which has been further stabilized and improved for the Folsom release.
“Nova was one of the net new pieces when OpenStack launched,” Stevens said. “Nova has had a lot of active development on it and now OpenStack developers have been putting Nova in the hands of end users. And they are getting things done.”
Stevens noted that Nova is now being used in production and there have been many deployments among what he termed ‘smart end-users’. In the Folsom release, Stevens doesn’t see any large architectural shift in Nova, rather just the benefits of time and effort by developers.
When the OpenStack project first launched it was all about compute with the Nova compute project and the Swift storage project. In the Folsom release that is expanding with the official inclusion of the Quantum networking project.
Quantum first appeared as a technical preview in the OpenStack Diablo release back in September 2011. A core component of Quantum is the Open vSwitch virtual switch, as well as components that enable a Software Defined Networking deployment for an OpenStack cloud.
Stevens explained that Quantum follows the overall OpenStack approach of being a framework for services plugins. He noted that while Quantum works with Open vSwitch, it also works with centralized controllers from multiple vendors, including Cisco and NTT.
“The holy grail is that when you lay out your compute and storage you need to be able to enable the network layer efficiently as well so you can build arbitrary domains for wherever you place your virtual machines,” Stevens said. “Quantum has that architecture to allow the IP connectivity to follow wherever you place virtual machines in an automated fashion.”
The Folsom release also introduces the new Cinder block storage component to OpenStack.
Stevens noted that OpenStack started out with an object-based storage system, the Swift project. In his view, Swift is great for storing and retrieving large blobs of data, but it is not as efficient as a core data store for rapidly changing data. “Cinder solves a huge gap and provides robust block storage,” Stevens said.
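The distinction Stevens draws can be sketched in a few lines: an object store reads and writes whole blobs, while a block device permits in-place writes at arbitrary offsets, which is what suits it to rapidly changing data. A toy illustration of the two semantics (not the actual Swift or Cinder APIs):

```python
class ObjectStore:
    """Swift-style semantics: objects are written and read whole.
    Changing one byte means re-uploading the entire blob."""

    def __init__(self):
        self._objects = {}

    def put(self, name, blob):
        self._objects[name] = bytes(blob)

    def get(self, name):
        return self._objects[name]

class BlockDevice:
    """Cinder-style semantics: a fixed-size array of bytes that can be
    modified in place at any offset, efficient for changing data."""

    def __init__(self, size):
        self._data = bytearray(size)

    def write(self, offset, data):
        self._data[offset:offset + len(data)] = data

    def read(self, offset, length):
        return bytes(self._data[offset:offset + length])

disk = BlockDevice(10)
disk.write(0, b"abcdefghij")
disk.write(3, b"XYZ")          # update 3 bytes in place
assert disk.read(0, 10) == b"abcXYZghij"
```

A virtual machine’s root disk or a database volume needs the block semantics; archival images and backups fit the object semantics, which is why OpenStack ended up with both Swift and Cinder.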