Jeff Bezos' Space Company Begins Test Flights This Year

Blue Origin, the commercial space company founded by Amazon CEO Jeff Bezos, announced yesterday that it had completed acceptance testing of its BE-3 rocket engine.

The BE-3 is capable of 110,000 pounds of thrust and is powered by liquid hydrogen. Its first use will be as the propulsion system for the company's proposed New Shepard space capsule, which will carry passengers and scientific payloads on suborbital flights into space. As the company moves toward its eventual goal of orbital spaceflight, the engine will be used for upper-stage rockets.

"The BE-3 has now been fired for more than 30,000 seconds over the course of 450 tests," Jeff Bezos said in a statement. "We test, learn, refine and then test again to push our engines. The Blue Origin team did an outstanding job exploring the corners of what the BE-3 can do, and soon we'll put it to the ultimate test of flight."

Blue Origin's BE-3 rocket engine being tested. (Credit: Blue Origin)

Company President Rob Meyerson told reporters yesterday that Blue Origin would begin unmanned test flights of the New Shepard capsule later this year, although no date was specified. Unlike other commercial space companies such as XCOR Aerospace and Virgin Galactic, Blue Origin has yet to sell any tickets or announce pricing for a flight on New Shepard.

In addition to the BE-3, Blue Origin is also developing a larger rocket engine, the BE-4, which will be capable of 550,000 pounds of thrust and fueled by a mixture of liquid oxygen and liquefied natural gas. Last September, the company announced a partnership with the Boeing-Lockheed Martin joint venture United Launch Alliance to develop the engine for use in ULA's next generation of launch vehicles.
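For readers more comfortable with metric units, here's a quick back-of-the-envelope conversion of the two engines' quoted thrust figures. This is just a sketch of the arithmetic; the only input beyond the article's numbers is the standard pound-force-to-newton factor.

```python
# Convert the quoted thrust figures from pounds-force to kilonewtons.
LBF_TO_N = 4.44822  # newtons per pound-force (standard conversion)

for engine, thrust_lbf in [("BE-3", 110_000), ("BE-4", 550_000)]:
    thrust_kn = thrust_lbf * LBF_TO_N / 1000
    print(f"{engine}: {thrust_lbf:,} lbf ~= {thrust_kn:,.0f} kN")

# BE-3: 110,000 lbf ~= 489 kN
# BE-4: 550,000 lbf ~= 2,447 kN
```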

The Large Hadron Collider Is Back In Action

(Credit: CERN - Photograph: Dominguez, Daniel; Brice, Maximilien)

Yesterday, scientists at the Large Hadron Collider successfully turned it back on, injecting two proton beams moving in opposite directions into the massive particle accelerator. The particles will travel at a relatively low energy of 450 GeV at first so that the operators can ensure that everything is working as it should. Once all systems are cleared, the proton beams will be accelerated to 13 TeV, nearly twice the energy used to find the Higgs boson.
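To put "nearly twice" in perspective, here's a quick sketch of the arithmetic. The one figure not in the article is the reference point: the 2012 Higgs-discovery run collided protons at a total of 8 TeV.

```python
# Compare the new collision energy to the Higgs-discovery run,
# and express a single collision's energy in everyday units.
EV_TO_J = 1.602176634e-19  # joules per electronvolt

run2_tev, higgs_tev = 13, 8
print(f"Energy ratio: {run2_tev / higgs_tev:.2f}x")  # ~1.62, "nearly twice"

joules = run2_tev * 1e12 * EV_TO_J
print(f"13 TeV per collision = {joules:.2e} J")      # ~2.1e-6 J
# Tiny in absolute terms, but concentrated into a single pair of protons.
```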

And that's where the fun will start.

The Large Hadron Collider had been shut down for about two years while upgrades were made to its various systems. These included consolidating some of the electrical systems, adding magnet protection systems, and improving the cryogenic and vacuum systems. The LHC will also be able to fire proton beams in bunches separated by 25 nanoseconds, half the previous interval.

The improvements to the LHC, along with the higher energies, will allow thousands of physicists around the world, including over 1,700 in the United States alone, to conduct experiments testing theories that so far have only been the province of computer simulations.

"We are on the threshold of an exciting time in particle physics: the LHC will turn on with the highest energy beam ever achieved," Fleming Crim, National Science Foundation Assistant Director, said in a statement. "This energy regime will open the door to new discoveries about our universe that were impossible as recently as two years ago."

The Large Hadron Collider's particle accelerator consists of a ring 27 km (about 16.7 mi) long. The protons are injected into the ring, which is kept under vacuum, and guided by superconducting magnets cooled to near absolute zero (-271 degrees C) as they are accelerated to nearly the speed of light. The particles are then crashed together, producing enormous amounts of energy. By studying the byproducts of those collisions, physicists can discover new particles and learn other things about the physics of subatomic particles.
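Those figures imply some striking numbers, sketched below using the article's ~27 km circumference and the 25 ns bunch spacing mentioned above, and assuming the protons travel at essentially the speed of light.

```python
# How often does a proton lap the ring, and how often do bunches cross?
C = 299_792_458          # speed of light, m/s
RING_M = 27_000          # ring circumference, ~27 km per the article
BUNCH_SPACING_S = 25e-9  # 25 ns between bunches after the upgrade

laps_per_second = C / RING_M
print(f"~{laps_per_second:,.0f} laps of the ring per second")  # ~11,100

crossings_per_second = 1 / BUNCH_SPACING_S
print(f"~{crossings_per_second:.0e} bunch crossings per second")  # 4e7 (40 MHz)
```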

Among the things the Large Hadron Collider will be looking for during its next round of experiments is more information about the Higgs boson and how it works. Scientists at CERN will also be trying to create the particles that are hypothesized to make up dark matter, as well as evidence for the first supersymmetric particle.

Facial Recognition Company Kairos Acquires Emotion Analysis Company IMRSV

(Credit: Kairos)

Miami-based facial recognition software company Kairos announced today that it has acquired emotion analysis company IMRSV for $2.7 million. IMRSV will be folded into Kairos rather than continuing to exist as a separate entity.

Prior to the acquisition, Kairos was a customer of IMRSV, incorporating its emotion analysis technology into Kairos' facial recognition offerings.

The impetus for emotion analysis "came from our customers," Kairos CEO Brian Brackeen told me. "One of them, for example, was a bank that was using our facial recognition as a means of authentication. They came back later and said that there were scenarios when access might be desired by the right person, with the right code, but we don't want them to have it. Like if they're anxious, maybe that's because it's the day they're going to rob the bank."

Another reason for the acquisition, Brackeen told me, was the company's developer focus. As part of the acquisition announcement, the company also introduced new APIs and an SDK for facial recognition, emotion analysis and crowd analytics. Combining emotion analysis and facial recognition means there are fewer APIs for developers to deal with, allowing them to incorporate Kairos' technology more quickly.
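Kairos didn't publish the details of the combined interface in the announcement, but as a purely illustrative sketch, a single request returning both identity and emotion signals might look something like the following. The endpoint, request fields and response shape here are invented for illustration; they are not Kairos' actual API.

```python
# Hypothetical sketch only: the endpoint and payload fields below are
# invented, not Kairos' real API. The point is to show why one combined
# call is simpler for developers than two separate services.
import requests

def analyze_face(image_url: str) -> dict:
    resp = requests.post(
        "https://api.example.com/v1/analyze",   # placeholder URL
        json={"image_url": image_url,
              "features": ["recognition", "emotion"]},
        headers={"Authorization": "Bearer YOUR_API_KEY"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # e.g. {"subject_id": ..., "emotions": {...}}
```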

As an added benefit, Brackeen said, the company also expects the acquisition to enable it to develop better products more quickly.

"Now we'll have one API and code base to work with," Brackeen said. "But more importantly, the larger sciences of computer vision and machine learning build on each other. This allows the improvement of both our facial recognition and emotion analysis, because the synergies between the two are very strong."

Selena Daly: Mapping the Italian Avant-Garde: Futurism in Space and Time (1909-1944) – Video


Recorded live at the spatial@ucsb Lightning Talks on Feb. 25, 2015. See the full lineup of talks at http://spatial.ucsb.edu/lightning-talks/.

By: Center for Spatial Studies, UCSB

Commercial Supercomputing Heats Up As Cray Sells One Of The World's Fastest Systems

Last week, Cray announced that it had entered into a contract to deliver one of its XC40 supercomputer systems to Petroleum Geo-Services (PGS), a company that provides data analysis and exploration services to help energy companies find the locations of oil and gas reserves.

PGS will be using the new system as part of its production process, analyzing the data the company gathers as it explores for oil and gas resources. That makes the system unique in that it won't be used, as most supercomputers are, for research and development purposes.

"This is exciting for us," Cray's Barry Bolding told me. "This isn't for a customer's R&D organization doing futures development. It's actually a production system doing their direct product. It's very similar to weather prediction, where we're right in the middle of things at sites around the world."

A Cray XC40 supercomputer. (Credit: Cray)

Equally interesting is that when it's deployed, the system will be one of the fastest in the world, processing data at about 5 petaflops. After the initial press release last week, IDC released a quick research note about the announced sale.
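A petaflop is a quadrillion (1e15) floating-point operations per second, so 5 petaflops is a scale worth pausing on. Here's a rough comparison; the ~100-gigaflop desktop baseline is my own assumption for illustration, not a figure from the announcement.

```python
# Rough sense of scale for a 5-petaflop machine.
SYSTEM_FLOPS = 5e15    # 5 petaflops, from the announcement
DESKTOP_FLOPS = 1e11   # ~100 gigaflops; assumed baseline, not from the source

ratio = SYSTEM_FLOPS / DESKTOP_FLOPS
print(f"~{ratio:,.0f}x a single high-end desktop")  # ~50,000x

# One second of the XC40's work would keep that desktop busy for:
print(f"~{ratio / 3600:.0f} hours")                 # ~14 hours
```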

"This, to IDC's knowledge, is the largest supercomputer sold into the O&G sector and will be one of the biggest in any commercial market," the report stated. "The system would have ranked in the top dozen on the November 2014 list of the world's Top500 supercomputers."

Building one of the dozen fastest supercomputers isn't new for Cray; the company has three in the current top 12. What is unique is that most of those 12 belong to government research labs or universities, not private companies. That may be starting to change, however. For example, IDC notes that supercomputing spending in the oil and gas sector alone is expected to reach $2 billion over the period from 2013 to 2018.

Cray has taken note of the commercial opportunities. "Internally we're investing in our infrastructure," Bolding told me. "We've been building up our sales teams and expertise in a number of segments. We've been averaging 10% in commercial sales over the past few years, but that's grown from zero."

Bolding went on to say:

"We believe we can grow here because of the convergence of big data and big computing. That impacts not just government data centers, but commercial workflows, whether it's energy exploration or manufacturing of jet aircraft or automated cars or social media. This convergence over the next few years is going to increase the computing needs of the commercial sectors."

MUSIC INDUSTRY: Present Shock: When Music's Future Arrives in the Here and Now [Kyle Bylin]

Has the conversation about streaming music services changed in recent years? Has music's future entered the absolute present? If so, how did music execs and indie artists react? Kyle Bylin, a tech writer and user researcher, explores all of these questions in his latest essay.

1. Music Futurism

I've spent several years of my life writing about the future of music listening. I love to look at the world through the lens of a music startup that has an ambition to change current listener habits, and to speculate on what the shift could mean if it actually happens. The greatest challenge of this pursuit is that behavioral change often takes a very long time to occur, and by the time a predicted shift begins to fully emerge, both the world and I have likely forgotten that I ever planted that flag in the ground.

I have woken up several times in the past couple of years to a news story about a music startup launch or new feature release that sounded very familiar. I look back in my blog post archive, and, sure enough, a few years earlier I predicted that this very thing might happen. So I email the writer with a hyperlink to an old blog post of mine, and then he or she updates his or her news story with an acknowledgment that I had said it first.

And then, life goes on.

There is no award for correctly predicting that something might happen at some point. Furthermore, it often takes several more years to learn whether a music startup or new feature will cause a behavioral shift among music listeners. There have been many cases where I hypothesized about how a specific feature would look and feel, and why it would matter, only to see some company realize the potential for a similar feature and incorporate it into part of its music website or mobile app.

Months or years later, I grab a coffee with the startup founder and ask him or her about this feature, only to find out that no one uses it. Did the company get the feature right? Could the feature have been a commercial success if it had been introduced in a different context or incorporated into another product? It's hard to know. I have heard that it can take many different implementations for a feature to catch on. Oftentimes, the company doesn't have enough time to test every possible angle. Some ideas come too early and others too late, but sometimes they arrive right on time. Timing is what every music startup must attempt to nail or defy.

Today, many versions of the future of music exist. Interestingly, I think this has decreased speculation about what this future might entail and increased concern from industry executives and indie artists about how the present will play out.

At the start of 2011, the online trade conversation about streaming music services was mainly based on anticipation and speculation: What will happen when company X does X? What will happen when Spotify finally launches in the U.S. and a free version is offered without a trial period? Will this freemium model lead to wider use of subscription music? What will happen when Apple releases a Pandora or Spotify killer? Apple has sold over 800 million iOS devices and has over 800 million credit cards on file with iTunes. How about Google, Facebook, Samsung, Twitter, or Amazon? What will happen when these major tech giants decide to enter the streaming music space? Will there be a streaming music war? Who will win? As each of these hotly anticipated and highly speculated things happened, the music industry's focus shifted from the next horizon to the present moment.

A strong indicator of this shift arrived in 2012, when several indie artists published their royalty statements online and stirred up a heated debate about streaming payout figures. In sum, their blog posts and social statuses said: Look at what Pandora and Spotify pay me right now. My royalty payments are too small. We must discuss this issue right now. For months, indie artists argued with industry executives about whether they understood how to read royalty statements and whether streaming payouts could ever support their careers. Most artists didn't seem to care whether they would receive more money from Pandora and Spotify as those companies' business operations and revenue streams grew in the coming years. All they focused on was how their streaming payouts compared to their music income and whether Pandora and Spotify royalties could supplant declining physical and digital sales. Suddenly, the conversation about whether Pandora and Spotify were the future of music grew into direct criticism of whether either company's business model was sustainable.

Research Confirms That Carbon Dioxide Led To Higher Temperatures In The Past

Over the past 400,000 years, the amount of carbon dioxide in the Earth's atmosphere has periodically fluctuated, and along with it, so have global temperatures. When the concentration of CO2 has increased, global temperatures have also increased, and vice versa.

Basic atmospheric chemistry, which has been well studied since the 19th century, suggests that the increased concentration of CO2 is driving the increase in temperatures. All else being equal, if you introduce more CO2 into a gaseous mixture containing mostly nitrogen and oxygen, like our atmosphere, you'll see more heat trapped, something you can demonstrate easily in the laboratory. However, untangling cause and effect in historical events can be tricky, especially in this case, where the evidence comes primarily from Antarctic ice cores.
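The size of that heat-trapping effect is often summarized with a simple logarithmic approximation published by Myhre and colleagues. A quick sketch follows; the coefficient of 5.35 W/m² and the ~280 ppm pre-industrial baseline come from that widely used approximation, not from the new study discussed here.

```python
# Approximate radiative forcing from a change in CO2 concentration,
# using the common logarithmic fit: dF = 5.35 * ln(C / C0) W/m^2.
import math

def co2_forcing_wm2(c_ppm: float, c0_ppm: float = 280.0) -> float:
    """Forcing relative to a pre-industrial baseline of ~280 ppm."""
    return 5.35 * math.log(c_ppm / c0_ppm)

print(f"Doubling CO2: {co2_forcing_wm2(560):.1f} W/m^2")  # ~3.7 W/m^2
print(f"At 400 ppm:   {co2_forcing_wm2(400):.1f} W/m^2")  # ~1.9 W/m^2
```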

Some earlier studies had suggested that the increased temperatures seen over the past 400,000 years actually preceded the increase of carbon dioxide concentration. While more recent research has cast significant doubt on those findings, a debate among some climate researchers over the causal relationships has remained.

CO2 concentration over the past 400,000 years. (Credit: NASA)

That may change now, thanks to a new mathematical analysis from an international team led by Egbert van Nes of Wageningen University. To develop their conclusions, the team used a method for detecting causality in complex systems developed by George Sugihara. These methods have been successfully used to untangle cause and effect in ecological systems where some variables may depend on one another, such as the relationship between sardine and anchovy populations and ocean temperatures in the Pacific Northwest. (For details on the original methods, see this paper.)
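Sugihara's technique is known as convergent cross mapping: it tests whether one time series can be reconstructed from the delay-embedded "shadow manifold" of another, and if so, infers that the first variable influences the second. Below is a minimal, illustrative sketch of the core idea; the function names and parameter choices are mine, and a real analysis (like the one applied to the ice-core data) adds convergence tests over increasing data length and surrogate-data checks.

```python
import numpy as np

def shadow_manifold(x: np.ndarray, E: int = 3, tau: int = 1) -> np.ndarray:
    """Delay-coordinate embedding of series x into E dimensions."""
    n = len(x) - (E - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(E)])

def ccm_skill(x: np.ndarray, y: np.ndarray, E: int = 3, tau: int = 1) -> float:
    """Cross-map estimates of y from x's shadow manifold.

    High correlation between estimates and the true y, improving as more
    data are used, is taken as evidence that y causally influences x.
    """
    Mx = shadow_manifold(x, E, tau)
    targets = y[(E - 1) * tau :]            # align y with embedding times
    preds = np.empty(len(Mx))
    for i, point in enumerate(Mx):
        dist = np.linalg.norm(Mx - point, axis=1)
        dist[i] = np.inf                    # exclude the point itself
        nn = np.argsort(dist)[: E + 1]      # E+1 nearest neighbours
        w = np.exp(-dist[nn] / max(dist[nn][0], 1e-12))
        preds[i] = np.dot(w / w.sum(), targets[nn])
    return float(np.corrcoef(preds, targets)[0, 1])
```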

Use of this statistical method, the authors write, "allows us to circumvent the classical challenges of unravelling causation from multivariate time series. We build on this insight to demonstrate directly from ice-core data that, over glacial-interglacial timescales, climate dynamics are largely driven by internal Earth system mechanisms, including a marked positive feedback effect from temperature variability on greenhouse-gas concentrations."

In other words, this new analysis allows climate researchers to confirm that the known chemistry of greenhouse gases helped drive positive feedback loops that led to increases in global temperatures. Even in cases over the past 400,000 years where temperatures may have started to rise first, the subsequent increase in carbon dioxide helped drive those trends upward when they might otherwise have leveled off or declined.

"Our new results confirm the prediction of positive feedback from the climate models," research team member Tim Lenton said in a statement. "The big difference is that now we have independent, data-based evidence."

Average global temperatures by decade. (Credit: World Meteorological Organization)

This new paper is significant because it provides further validation for current climate models and yet another resource for demonstrating the consequences of rising carbon dioxide concentrations in the Earth's atmosphere. From the perspective of the last 400,000 years, we're currently entering uncharted territory: the amount of carbon dioxide in the atmosphere now exceeds any concentration seen during that period. And the results are predictable; the past few decades have seen increased average temperatures. Last year, 2014, was the hottest year on record since 1880.

Preterism vs. Futurism discussion with Pastors Michael Miano and Robert Iannuccilli 3/21/15 – Video


BLUE POINT, NY-- Pastors Michael Miano of Blue Point Bible Church and Robert Iannuccilli of Faith on Fire Ministries hold an informative discussion on Preterism and Futurism in biblical thought...

By: Mert Melfa

Solving The Problem Of Scientific Reproducibility With Peer-Reviewed Video

In a story published last week in the Boston Globe, Carolyn Johnson covered one of the quiet crises facing scientific research today: the fact that many published findings cannot be reproduced by other scientists.

"But talk to a scientist long enough, and you'll probably hear a story like this: An intriguing new discovery was reported in a research journal," Johnson wrote in the article. "Maybe it was a biologist describing a new Achilles' heel in cancer cells, a psychologist's profound insight into human behavior, or an astronomer's finding about the first moments of the universe. The scientist read about the finding and tried to confirm it in her own lab, but the experiment just didn't come out the same."

Although in some instances the cause is outright fraud, far more often the causes are prosaic. Failure to replicate an experiment may simply mean there was something wrong with the instruments in the initial experiment. (This famously happened to the OPERA collaboration, which announced that it had measured neutrinos moving faster than light. After attempts to reproduce the experiment failed, the measurements were revealed to be erroneous, caused by a bad data connection in the collaboration's instruments.)

Sometimes, though, an experiment doesn't work because important steps were left out of the original paper, making it impossible for other scientists to replicate it. Sometimes this is oversight, and sometimes it's just that a particular lab has common practices that don't carry over to the wider community. In these cases, the experimenters didn't produce a false result; it's just hard for other people to demonstrate the same thing.

Journal of Visualized Experiments founder Moshe Pritsker (Credit: JoVE)

This day-to-day aspect of scientific research was deeply frustrating to Dr. Moshe Pritsker when he was doing research toward his PhD a little over a decade ago. He would find himself unable to complete experiments working just from the published papers, but when he was able to meet with an experimenter or visit their lab, he could then replicate the experiment. The experience left him frustrated with the way science was being communicated in the 21st century.

"Why doesn't it work?" he told me. "It's text. It's not good for transfer of knowledge about complex experiments. When you see people do it, you'll get small details you can't get from text."

The time-consuming process of visiting the original labs to see how experiments were performed led Pritsker to think that there had to be a better way for scientists to share information than the centuries-old practice of publishing papers.
