‘UFC 214 Embedded,’ No. 2: ‘Cyborg’ uses her head, Jon Jones gets on his bicycle – MMAjunkie.com

News July 26, 2017 2:30 pm Steven Marrocco

The Association of Boxing Commissions today voted unanimously to add four new weight classes to the unified rules of MMA, increasing to 13 the number of recognized divisions in the sport.

UFC July 26, 2017 1:45 pm Blue Corner

Eddie Alvarez and Kevin Lee aren't opposing coaches on TUF 26, but that didn't prevent the lightweights from having an encounter in Las Vegas.

UFC July 26, 2017 1:05 pm Fernanda Prates

Before he joined the UFC, Paulo Borrachinha helped Vitor Belfort prepare for two fights. Now, he wants to fight the veteran himself.

UFC July 26, 2017 12:29 pm MMAjunkie Staff

Stream or download Tuesday's episode of MMAjunkie Radio with UFC Senior Vice President and Head of Content and International Joe Carr.

Videos July 26, 2017 12:00 pm Fernanda Prates and Ken Hathaway

Geoff Neal was at work when he got the call from his coach to be part of Dana White's Contender Series 3 on three days' notice. His reaction? "I told them, 'I'm not going to be here this weekend, I'm gone.'"

WSOF July 26, 2017 11:15 am MMAjunkie Staff

The light heavyweight division has proven an interesting case study in recent times.

Videos July 26, 2017 10:30 am Blue Corner

So, you've gone out there and won a fight at a Dana White's Contender Series event. Now what? As we see in a backstage video with the Week 3 winners, the nervousness isn't over.

Videos July 26, 2017 10:00 am MMAjunkie Staff

Now riding a two-fight streak that includes a stellar Bellator debut, Invicta FC vet Amanda Bell has learned to balance her finishing instincts with the patience to embrace decision outcomes.

Videos July 26, 2017 9:30 am Blue Corner

Cristiane Justino's first UFC title fight takes place Saturday at UFC 214, and the first edition of Cyborg's all-access video blog, "Cyborg Nation," is now available.

UFC July 26, 2017 8:30 am Mike Bohn

Saturday's UFC 214 event has a stacked card, and even the prelims are star-studded. As we learn in our UFC 214 pre-event facts, Renan Barao has absolutely flawless takedown defense.

Original post:

'UFC 214 Embedded,' No. 2: 'Cyborg' uses her head, Jon Jones gets on his bicycle - MMAjunkie.com

Tonya Evinger Considers Title Bout With Cris Cyborg a ‘Super Fight’ – MMA News

Tonya Evinger realizes the importance of her title bout with Cris Cyborg.

This Saturday night (July 29), Evinger will battle Cyborg for the vacant Ultimate Fighting Championship (UFC) women's featherweight title. It'll be Evinger's UFC debut and Cyborg's first title shot in the promotion.

The action takes place inside the Honda Center in Anaheim, California, and will be part of UFC 214's main card. Megan Anderson was expected to meet Cyborg for the gold, but personal issues put a halt to those plans.

During a recent media call, "Triple Threat" talked about her opponent being the heavy favorite:

"I think I'm the underdog every time, so it's just one of the roles I play. I seem to get the fans that don't think I'm as good a fighter as I am. I think that we're both just two dominating champions. I think this is a super fight and it's a fight that could've happened a long time ago. It's just definitely one to watch."

When asked about openings she has seen watching tape of Cyborg's fights, Evinger said she isn't concerned with that.

"I don't go into any fight thinking my opponent has any weakness. I just go in there to put my style out there and be offensive myself."

View post:

Tonya Evinger Considers Title Bout With Cris Cyborg a 'Super Fight' - MMA News

The Weekly Grind: Tyron Woodley freestyles live on Sway in the Morning, Cyborg and Garcia jump teammate – MMA Fighting

The life of a professional fighter isn't all glitz and glamour.

As if getting punched, kicked, kneed, choked and twisted into a pretzel on a regular basis isn't enough, fighters attract a general wackiness that makes their lives, well ... interesting.

To commemorate these day-to-day hardships, slip-ups, pranks, and more, we bring you, "The Weekly Grind."

Shining.

A post shared by Conor McGregor Official (@thenotoriousmma) on Jul 23, 2017 at 2:01am PDT

Sparring today.

A post shared by Conor McGregor Official (@thenotoriousmma) on Jul 22, 2017 at 6:55pm PDT

the force is strong

A post shared by Demetrious Johnson (@mightymouse125) on Jul 22, 2017 at 7:10pm PDT

Easy $$

A post shared by Sugar Sean O'Malley (@sugaseanmma) on Jul 18, 2017 at 9:44pm PDT

Originally posted here:

The Weekly Grind: Tyron Woodley freestyles live on Sway in the Morning, Cyborg and Garcia jump teammate - MMA Fighting

Sex robot 2018 REVEALED: New human-like feature to take cyborgs to ‘NEXT LEVEL’ – Daily Star

A SEX robot firm is making a doll with a fully functional Terminator-like head which can speak, smile and even sing in a major breakthrough for the cyborg industry.

Doll makers have created a prototype of the DS Doll Robotic Head, coated in silicone skin, which can be controlled by a smartphone or PlayStation controller.

The model, made by Doll Sweet Dolls and EX Doll, has a stunningly beautiful face, and can listen and answer questions using voice recognition software.

But the head will only be able to speak in Chinese, although there are plans to launch a campaign to raise enough money to produce an English version.

The futuristic doll, which hits the shelves at the end of 2018, does not have a rotatable neck, but will fit all the company's other doll bodies, which have flexible limbs allowing them to move like a real person.

Paul Lumb, head of Cloud Climax, says the prototype is "the Bugatti Veyron of VR" - and his company hopes to bring the doll to the UK by the end of the year.

He said: "EX Dolls have been working on a robotics head since 2014, but we're generations away from a Terminator-style cyborg.

"What they've been doing is developing app control for the robot head, but currently with the prototype it's only got Chinese dialect, which is their core market.

"Over the next 12 months, they want to develop Japanese and English speaking models.

"They will have an element of natural conversation, so they won't sound too robotic, but they will take time - languages are massive.

"The voice recognition is no different to a smart phone, but this model also has facial expressions, unlike standard silicone heads.

"Over the last two years we've been moving towards real touch, real feel silicon dolls, enhancing the skeleton to make it lighter - these adult-sized dolls have a combined head and body weight of five stone.

"It's an opportunity to take this to the next level of where doll development is going, and it's been happening quietly behind the scenes for ten to 15 years.

"This is revolutionary, it's a game-changer - from a gamer's perspective, it'd be like a new console being released.

"This is the next step of teledildonics reality, it's at the forefront of adult tech and is really the Bugatti Veyron of VR."

Paul said the prototype represents the very latest in natural realism, but it comes at a hefty price, with the dolls set to go on sale at around £4,500.

He added: "Two studio level make up artists have taken this head to the next level, it looks very natural in the face.

"The realism of a robotic head and body together will be the next level of enhancement - the guys who work for EX Dolls have masters in robotics and are continually evolving this technology.

"From a software point of view, you don't want it breaking, and they need to get the internal components correct.

"They're also working on facial posturing, narration and interaction.

"We're still six or seven years away from having the ability to incorporate sensor pads and electronic wiring - it'll probably be a decade before prototypes are even made.

"You just don't want a heavy doll filled with technology, it can't really be an actual sex robot - it'd be too expensive.

"The skeleton can be posed and moved, and the head can be interacted with, but there's not full body movement and it's just an enhancement to what we're currently selling.

"This is the ultimate, really - I'd be very surprised if we see a fully autonomous body soon, it's too future tech and that'll be 15 to 20 years down the line."

View post:

Sex robot 2018 REVEALED: New human-like feature to take cyborgs to 'NEXT LEVEL' - Daily Star

New Jersey Issues Advisories, Closes Some Beaches After Bacteria Turns Up in Water Testing – NBC New York

A dozen New Jersey swimming beaches are under a bacteria advisory and nine others are closed after test results showed rising levels of bacteria.

The state Department of Environmental Protection issues an advisory when a sample exceeds the state standard for the presence of enterococci, a type of bacteria found in animal and human waste.

Swimming advisories warn the public of potentially unhealthy water conditions, and additional sampling is conducted until water quality results are again within standard.

Beaches are closed if two consecutive samples collected at a bathing beach exceed the state standard. Beach closings remain in effect until subsequent sampling indicates bacteria levels are again below the standard.
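Taken together, the advisory and closure rules above amount to a small state machine over consecutive samples. Here is a minimal Python sketch of that logic; the 104-per-100-mL enterococci threshold is an assumption (New Jersey's commonly cited single-sample marine standard) and is not stated in the article:

```python
# Sketch of the advisory/closure rules described above. The 104 CFU/100 mL
# enterococci threshold is an assumption, not stated in the article.
STANDARD = 104  # enterococci per 100 mL

def beach_status(samples):
    """Return 'open', 'advisory', or 'closed' from chronological sample counts:
    one exceedance posts an advisory, two consecutive exceedances close the
    beach, and a compliant sample clears the beach."""
    status, prev_exceeded = "open", False
    for count in samples:
        exceeded = count > STANDARD
        if exceeded and prev_exceeded:
            status = "closed"
        elif exceeded:
            status = "advisory"
        else:
            status = "open"
        prev_exceeded = exceeded
    return status

print(beach_status([40, 180]))        # advisory
print(beach_status([40, 180, 220]))   # closed
print(beach_status([180, 220, 30]))   # open again
```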

The closure applies to water activities like swimming, wading and playing in the water. Other beach-related activities like sunbathing and walking on the beach are unaffected.

The following beaches are closed:

ATLANTIC COUNTY

Atlantic City

MONMOUTH COUNTY

Deal Borough
Highlands Borough
Long Branch City
Middletown Township
Sea Bright Borough
Sea Girt Borough
Spring Lake Borough

OCEAN COUNTY

Barnegat Light Borough
Point Pleasant Beach Borough
Long Beach Township
Toms River Township

Read the rest here:

New Jersey Issues Advisories, Closes Some Beaches After Bacteria Turns Up in Water Testing - NBC New York

Two Cayuga Co. bathing beaches closed by algal blooms back open – CNYcentral.com

CAYUGA COUNTY, N.Y.

Two bathing beaches along Cayuga Lake have reopened after being closed due to the presence of harmful algal blooms, according to the Cayuga County Health Department.

The beaches at Wells College dock in the village of Aurora and Frontenac Park in the village of Union Springs were closed last week when the blooms were detected. The health department announced Wednesday that the beaches were back open to the public.

The Health Department says it monitors the water quality at public beaches routinely and closes swimming areas when a potential hazard is identified.

The algal blooms are described as paint-like or as giving the water a filmy appearance, and should be avoided. If any are spotted, the public is encouraged to report them to the state Department of Environmental Conservation at HABsinfo@dec.ny.gov.

If you have any questions about algal blooms, you can call the Cayuga County Health Department at 315-253-1560.

Read the original post:

Two Cayuga Co. bathing beaches closed by algal blooms back open - CNYcentral.com

Why great white sharks keep coming close to California beaches – USA TODAY

USA Today Network - Cheri Carlson, Ventura County (Calif.) Star. Published 4:48 p.m. ET July 24, 2017 | Updated 5:05 p.m. ET July 24, 2017

For the past 10 years, researchers have tagged juvenile white sharks off Southern California. They found a half dozen hot spots for shark nurseries just off the coast. Cheri Carlson/The Star

Chris Loeb of the California State University-Long Beach Shark Lab releases a juvenile white shark off the coast of Southern California. (Photo: Shark Lab at California State University-Long Beach)

VENTURA, Calif. - A nursery for great white sharks sits just off Ventura and Oxnard, two of a half dozen hot spots along the Southern California coast.

Chris Lowe, a professor at California State University-Long Beach, and his team at the university's Shark Lab searched through close to 100 years of fishery records and identified hot spots for the juvenile sharks. The state started keeping shark records long before the 1988 debut of the Discovery Channel's Shark Week, which marks its 30th anniversary this week.

About 10 years ago, they began tagging young white sharks along the coast to try to confirm those findings. They did.

"With the acoustic transmitters that we implant in the sharks, they have to swim within 300 yards of one of our underwater receivers," Lowe said. "We have those all along the coast in Southern California."
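The receiver network Lowe describes boils down to a proximity test: a tagged shark is logged whenever it swims within detection range of any receiver. A minimal sketch of that check, with made-up coordinates and a flat-earth distance approximation that is adequate at these short ranges:

```python
import math

# Hypothetical receiver positions (latitude, longitude); the ~300-yard
# detection range from the article is about 274 meters.
RECEIVERS = [(34.275, -119.294), (34.188, -119.248)]
DETECTION_RANGE_M = 274.0

def distance_m(a, b):
    """Approximate ground distance in meters between two (lat, lon) points;
    fine at the few-hundred-meter scales involved here."""
    lat1, lon1 = a
    lat2, lon2 = b
    m_per_deg_lat = 111_320.0
    m_per_deg_lon = 111_320.0 * math.cos(math.radians((lat1 + lat2) / 2))
    return math.hypot((lat2 - lat1) * m_per_deg_lat,
                      (lon2 - lon1) * m_per_deg_lon)

def detected(shark_pos):
    """True if a tagged shark at shark_pos would be logged by any receiver."""
    return any(distance_m(shark_pos, r) <= DETECTION_RANGE_M for r in RECEIVERS)

print(detected((34.2752, -119.2942)))  # True: ~30 m from the first receiver
print(detected((34.500, -119.500)))    # False: well out of range
```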

What they found was that the young white sharks stayed in the areas - Dana Point, Huntington Beach, Oxnard, Santa Monica Bay and Ventura - over the summer before working their way down the coast and heading to the Baja Peninsula for the winter.

After the winter, some of those young sharks head back to Southern California the next summer. "Some do it over and over again," Lowe said.

"The sharks that we tag tend to hang out mainly at those hot spot areas during their first few summers," he said.They are safer areas for the juveniles ones away from predators and full of easy-to-catch sting rays.

While the young sharks come back to Southern California, they don't always pick the same spot. Why they pick one over another is one of the questions Lowe would like to answer.

It's still too early in the season to say whether they will hang out off the Ventura County coast this year, but in the past month people have seen them.

In June, state lifeguards posted signs along the beach after authorities confirmed a sighting of a juvenile shark near Ventura Pier.

The beach wasn't closed, which follows a protocol used throughout the area. The signs just acted as an advisory, said Tyson Butzke, Ventura sector supervisor for California State Parks.

The shark, about 6 to 8 feet long, didn't show any aggressive behavior, and the advisory was lifted the next day.

For 20 years, Ventura Harbormaster John Higgins said he hadn't seen any sharks close to shore in the area.

Then in 2015, surfers, harbor patrol and others saw them regularly just off the coast.

"I could go there almost every day and see them," Higgins said of a spot just south of the harbor beaches.

Beach-goers enjoy a warm day near the Ventura Harbor South Jetty in Ventura, Calif. (Photo: Chuck Kirman, Ventura County (Calif.) Star)

Since then, "we have been kind of invested in knowing about these sharks," he said.

Higgins and others have worked with Lowe to learn more about the sharks and helped set up receivers to help track the tagged sharks.

Now, harbor patrol officers specifically look for the juvenile sharks, and agencies have worked together to share information.

"They are just juvenile white sharks. They're not the great white shark that you see on 'Jaws' or as part of Shark Week," Higgins said.

A newborn great white shark measures 4 to 5 feet long and grows about a foot annually for the first three or four years.

"We've seen the number of babies in Southern California steadily increase over the last 10 years," Lowe said. "That's a sign that we're doing some things right and that our coastal ocean is getting healthier."

Most of the sharks along the beaches in Ventura and Oxnard are between a few weeks and a few years old.

They tend to stay closer to shore, eating stingrays and fish along the bottom of the ocean.

Aggressive behavior is rare, but people should leave them alone, Lowe said. Like any wild animal, they will defend themselves if threatened.

Follow Cheri Carlson on Twitter: @vcCheri

Read more:

Why great white sharks keep coming close to California beaches - USA TODAY

Swimming advisories lifted for three Presque Isle beaches – GoErie.com

Swimming advisory lifted for Beach 1, Barracks Beach and Beach 6; precautionary advisory remains for Beach 9.

Presque Isle State Park officials have lifted a swimming advisory that was issued for Beach 1, Barracks Beach and Beach 6 on Tuesday, and have issued a precautionary swimming advisory for Beach 9.

The advisory, which was placed because of elevated E. coli bacteria counts, was lifted on Wednesday at 11:45 a.m.

According to Erie County Department of Health protocol, if E. coli bacteria counts are between 235 and 999 per 100 milliliters of water, a swimming advisory is posted for a beach. The precautionary swimming advisory will be lifted when there is no longer an environmental and public health concern, officials said.
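Expressed as code, the Erie County protocol above is a simple range classification. A minimal sketch; the closure threshold of 1,000 counts and above is an assumption, as the article states only the advisory range:

```python
def erie_beach_action(ecoli_per_100ml):
    """Classify one sample per the protocol described above: 235-999 E. coli
    per 100 mL posts a swimming advisory. The 1,000+ closure threshold is an
    assumption; the article states only the advisory range."""
    if ecoli_per_100ml < 235:
        return "no action"
    if ecoli_per_100ml <= 999:
        return "swimming advisory"
    return "closure (assumed threshold)"

for sample in (120, 480, 1500):
    print(sample, "->", erie_beach_action(sample))
```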

Swimming is allowed at beaches with advisories, but officials recommend that swimmers avoid swallowing lake water and avoid swimming with open cuts or wounds. Beach users should wash their hands before handling food. Beaches under a precautionary swimming advisory are still open to the public for swimming, sunbathing and other recreational opportunities.

Presque Isle State Park has 13 regulated swimming beaches.

For information, call the park office at 833-7424.

Go here to read the rest:

Swimming advisories lifted for three Presque Isle beaches - GoErie.com

New Horizons’ next target: spotted – Astronomy Magazine

NASA's New Horizons spacecraft changed our view of the outer solar system forever when it flew by Pluto in 2015. Now, it's on its way to the next destination: a Kuiper Belt object (KBO) known only as 2014 MU69. Although the spacecraft won't reach its target until New Year's Day in 2019, NASA is already looking ahead to learn as much about 2014 MU69 as possible, thanks to a convenient temporary alignment that recently allowed the object to pass in front of a background star.

The passage, called an occultation, occurs when objects line up in the sky as viewed from Earth. When an object, such as an asteroid, planet, dwarf planet, or KBO, passes in front of a distant star, astronomers can watch the way the starlight dims and returns to gain information about the object passing in front of it. This information can include size, shape, and even whether the object possesses rings, moons, or an atmosphere.

The recent occultation was visible from the Southern Hemisphere; the New Horizons team used 24 mobile telescopes in Argentina to view the event, which lasted only about two seconds. This effort, which thus far has yielded five successful occultation detections, is vital to the characterization of 2014 MU69 before New Horizons arrives. That's because this tiny, distant object is poorly understood; currently, it's believed to span about 14-25 miles (22-40 kilometers) in diameter, but little else is known about its shape and composition thus far.
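The two-second duration squares with the size estimate above: the chord the star traces across the object is roughly the shadow's ground speed times the occultation duration. A back-of-the-envelope sketch, where the ~20 km/s shadow speed is an assumed typical value for a distant KBO and is not from the article:

```python
# Rough occultation chord: chord length = shadow speed x duration.
# The ~20 km/s shadow speed is an assumed typical value for a distant KBO
# (dominated by Earth's ~30 km/s orbital motion); it is not from the article.
shadow_speed_km_s = 20.0
duration_s = 2.0

chord_km = shadow_speed_km_s * duration_s
print(f"chord ~ {chord_km:.0f} km")  # ~40 km, consistent with the 22-40 km estimate
```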

See the original post:

New Horizons' next target: spotted - Astronomy Magazine

These are the most breathtaking astronomy photographs of the year … – Evening Standard

Original post:

These are the most breathtaking astronomy photographs of the year ... - Evening Standard

Believe it or not, these planetary pictures were taken from Earth! – SYFY WIRE (blog)

After those two huge solar eclipse posts this week (glad you asked: they are here and here), how about a bit of planetary eye candy for you?

If you go outside tonight after sunset and look to the sky, you might notice a bright "star" high to the southwest (for Northern Hemisphere observers). That's the planet Jupiter. If you turn and look southeast you can also see Saturn, another bright star-like shining object.

To the eye, they are unresolved, just dots. But to a powerful telescope in the right location, they are glorious. Behold!

Holy wow! That, me droogs, is Jupiter and its moon Ganymede. And that image was taken not by a space probe orbiting the giant planet, but by a telescope right here on Earth!

It was taken on June 10, 2017, using a 1-meter telescope at the Pic du Midi observatory, one of the best (if not the best) spots on Earth to observe the planets. The observatory is in the French Pyrenees, and has very stable air around it. Unsteady atmospheric conditions blur out small details (astronomers confusingly call this "seeing"), but the smooth flow around Pic du Midi means really high-res images can be taken.

That image, and all the others in this article, were taken as part of a professional-amateur collaboration, in which truly advanced and expert amateur astronomers get access to the scopes and then process the images. The scientific purpose is to keep track of the outer planets (Jupiter, Saturn, Uranus, and Neptune) and Venus, monitoring their atmospheric patterns and wind speeds to aid spacecraft sent to investigate them up close.

But more than that, this sort of collaboration is a meeting of brains, a sharing of experience, that allows everyone to learn from each other's efforts. The image above was processed by master planetary astrophotographer Damian Peach. The details are astonishing; you can see whorls of turbulence between the dark belts and light zones, individual storms thousands of kilometers across in the southern belt, and of course the Great Red Spot about to rotate out of view on the right.

Note Ganymede in the upper left: That's Jupiter's biggest moon and, indeed, the biggest moon in the solar system. It's comfortably bigger than Mercury, and if Jupiter weren't there we might consider it a planet in its own right! You can see detail on the surface of this rocky, icy world; note how dark it is, punctuated with spots of brighter ice. Compare it to a map made from Galileo and Voyager images - the bright spot to the lower right is the impact crater Osiris.

In this image, Damian processed Ganymede differently and then created a final composite moving Ganymede in closer to the planet so it's easier to see. Another of the astronomers on the team, Emil Kraaikamp, processed one of the images taken a bit later, keeping Ganymede in its correct spot relative to its home planet:

Note how far it is! And also note that the Red Spot has rotated a bit to the east, and is closer to the planet's limb.

They also created a gorgeous animation of the planet rotating using infrared light:

Wow. You can see Jupiter's cloud patterns change subtly, and Ganymede move in its week-long orbit around the massive planet.

I already wrote about the image they took of Saturn a few hours later that same night, and it's just as stunning. I haven't seen the images of Neptune or Uranus yet, though. However, later that night, before sunrise, they caught Venus rising in the east:

The animation was made from two images taken about 25 minutes apart, and you can see some movement in that time. The images were in the ultraviolet; using visible light (the kind we see) Venus is almost featureless, but the clouds reflect ultraviolet sunlight differently, and more interesting things can be seen. Venus rotates slowly, taking 243 Earth days to spin once, but the atmosphere rotates faster than that. This is called superrotation, and is what causes that huge chevron-shaped feature in the clouds.

What wonderful images! Such a delight for the eyes and brain, but also for the science itself. To think that we can achieve such results from Earth, tens if not hundreds of millions of kilometers from the target planets. And in three cases (Venus, Jupiter, and Saturn), it's done to support probes we have physically orbiting those bodies! And who knows? Maybe, in the next few years, we'll send more spacecraft to Uranus and Neptune.

I'm very glad to see this teamwork out of Pic du Midi. It's a lovely example of collaboration, which is in many ways what science is all about.

Original post:

Believe it or not, these planetary pictures were taken from Earth! - SYFY WIRE (blog)

Smith astronomer presents rare images of stars at national conference – GazetteNET

NORTHAMPTON - When Smith College astronomy professor James Lowenthal got images back from the Hubble Space Telescope this year, his initial response was simple: "Wow!"

What he was looking at were the brightest infrared galaxies in the universe - close-up views of rare, ultrabright collections of stars from the early universe that are furiously producing even more stars. Those views, Lowenthal told the Gazette at his office on Tuesday, may someday help answer a fundamental question about the history of the cosmos: how did galaxies form and evolve?

The images Lowenthal was observing made use of a well-known effect called gravitational lensing. Essentially, the light from those 22 distant galaxies passes through the gravitational field of a closer massive object, which acts as a kind of cosmic magnifying glass for researchers on Earth.

That foregrounded, natural lens allows astronomers to see otherwise impossible-to-see pictures of the distant universe. Light traveling from those galaxies takes billions of years to reach Earth, so researchers are quite literally looking into the past - at galaxies from as long as 12 billion years ago, about 90 percent of the way back to the Big Bang, according to Lowenthal.
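The 90 percent figure is simple arithmetic against the age of the universe, taken here as the standard 13.8 billion years (a value assumed, not stated in the article):

```python
# Lookback fraction: how far "back toward the Big Bang" a 12-billion-year
# lookback time reaches. The 13.8-Gyr age of the universe is assumed.
age_of_universe_gyr = 13.8
lookback_gyr = 12.0

print(f"{lookback_gyr / age_of_universe_gyr:.0%}")  # 87%, i.e. roughly 90 percent
```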

Lowenthal presented those images at the American Astronomical Society meeting in Austin, Texas, last month.

"The reaction in our scientific community has been, 'This is so, so cool,'" Lowenthal said of the response from his colleagues.

But before Lowenthal could take that peek into the past with his fellow researchers - including Min Yun, Kevin Harrington, Patrick Kamieneski and Daniel Wang of the University of Massachusetts Amherst - they had to write a scientifically rigorous proposal laying out their case for getting highly sought-after time on the Hubble telescope.

"We convinced them it would be really cool," Lowenthal said of the proposal. "And wow! It was really cool."

Lowenthal said Yun and others cleverly discovered the galaxies by using publicly available data from several telescopes, and used the Large Millimeter Telescope - a joint project between UMass and Mexico's National Institute of Astrophysics, Optics and Electronics - to confirm their distances from Earth.

It was thanks to that work narrowing down a list of distant galaxies that the team knew where to look when they got time on the Hubble telescope.

The distant galaxies in the Hubble images are producing 5,000 to 10,000 times more stars than the Milky Way, but are using the same amount of gas contained in the Milky Way. That fact leaves astronomers to puzzle over what exactly is fueling that star birth.

Possible explanations for the rapid creation of stars could be the collision of massive galaxies, a flood of gas or something entirely different. At issue is the very nature of galaxy formation and evolution.

Those are lingering questions that Lowenthal hopes to answer, but first the images from the Hubble telescope must be decoded.

While gravitational lensing makes those distant galaxies more visible in high detail, it also bends their light, leaving warped images with streaks, circles and arcs that can leave researchers unclear about what exactly they're looking at. The task now is to unscramble those pictures.

To explain the warping of the images, Lowenthal used the analogy of looking at candlelight through a wine glass. The light will appear in different spots, or even stretch across the bottom of the glass in a circle, depending on how the glass is held.

Because the images they've received are warped, researchers must now work backwards to reconstruct what those galaxies actually looked like before their light passed through the lens. Knowing the distance of those galaxies, Lowenthal and others must figure out other variables - like the gravitational pull of the lens - to model what the original image looked like, or to even figure out what the background and foreground are.

"From Hubble, we got only monochromatic, black and white images. It's only one wavelength," Lowenthal said, noting that he's hoping to get images from Hubble in the future that will show colors like red and blue. "If we did have that information, it would tremendously, instantly help us separate foreground from background, because the foreground and background are almost always different colors."

Lowenthal and his colleagues failed to get approval to use the Hubble telescope during the latest cycle of proposals, but he said he hopes they'll soon have access again, and they hope to gain further insight into the nature of those early galaxies.

While he waits for more data, however, the images Lowenthal already has have nevertheless changed his perception of the cosmos in at least some way. As a scientist who normally studies distant galaxies without much emphasis on gravitational lensing, the new images have made him rethink the galaxies he has been looking at for so many years.

"I have not been thinking, 'Most of those galaxies are probably gravitationally lensed, at least a little bit,'" Lowenthal said. "And now I'm thinking, 'Everything is lensed!'"

"It's definitely startling to have a big shift like that," he said, though the smile on his face and wonder in his eyes seemed to indicate he was far more excited and curious for the work ahead than startled.

Read the original here:

Smith astronomer presents rare images of stars at national conference - GazetteNET

UK Astronomy Professor Discusses What To Expect On Eclipse Day – LEX18 Lexington KY News

LEXINGTON, Ky. (LEX 18) - With the solar eclipse just around the corner, a UK astronomy professor who has traveled the world to see solar eclipses sat down with LEX 18 to describe what people should expect to see on August 21.

"A total eclipse of the sun is an event in which the moon comes between the sun and the Earth. And the moon casts its shadow on the Earth," said UK Physics and Astronomy Professor, Tom Troland.

He said to see the total solar eclipse, you have to be in the path of totality.

"It is a path across the face of the Earth, about 100 miles wide and thousands of miles long. That path of totality passes across the entire continental United States," he said.

The path of totality goes from Oregon to South Carolina, passing through Western Kentucky but not through Lexington. The professor says Lexington will be able to see part of the eclipse, though.

"In Lexington, the sun will only be 95% covered at the maximum eclipse time, which is about 2:30 in the afternoon on August 21st," he said.

And he said you don't need to be an astronomer to appreciate the total solar eclipse.

"If you're a sensing human being, if you have some sense of the beauty of nature, you'll never forget what you see with a total solar eclipse," he said.

Continue reading here:

UK Astronomy Professor Discusses What To Expect On Eclipse Day - LEX18 Lexington KY News

Astrophysics, Galaxy Clusters and the Key to the Universe – Hamilton College News

During the fall of 2016, Anya Nugent '18 began looking into researchers at Lawrence Berkeley National Laboratory (Berkeley Lab) who were doing work in astrophysics or cosmology.

Berkeley Lab, where Nugent also worked last summer, is a member of the national lab system supported by the U.S. Department of Energy, and conducts research across a vast range of scientific disciplines.

Technologies developed at Berkeley have generated thousands of jobs, and billions of dollars in revenue.

This summer, Nugent is participating in the Berkeley Lab Undergraduate Research (BLUR) program to study galaxy clustering. After contacting Shirley Ho, a senior scientist at Berkeley Lab, Nugent was put in contact with Ho's postdoc, Chamberlain and Einstein Fellow Zachary Slepian.

Slepian, Nugent's mentor, and Nugent discussed several possible research projects to pursue over the summer, but eventually decided to study galaxy clustering as a way to learn more about dark energy and General Relativity. "Though we know that dark energy and gravity affect how distances between objects change with time, we still do not completely understand their fundamental properties. By studying galaxy clustering, we can expand our knowledge of these topics, which is key to comprehending how our universe works," she said.

To measure galaxy clustering, Slepian and Nugent are using a three-point correlation function (3PCF), which examines triangles formed by galaxy triplets by measuring two triangle sides and the angle between them. Traditionally, the 3PCF has been too computationally expensive to measure. However, Slepian discovered a new way of analyzing galaxy triplets, which reduced the computational scaling of the function, making it tractable. A group at the National Energy Research Scientific Computing Center (NERSC) was able to write code for this innovative algorithm, which will soon be used to analyze the results from the 3PCF and galaxy clustering.
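For intuition, here is a toy sketch of the brute-force measurement the paragraph describes: bin two triangle sides and the opening angle over every galaxy triplet. This is the naive O(N^3) computation whose cost Slepian's algorithm avoids; the toy positions and bin choices are made up, and a real measurement would also need random catalogs for normalization:

```python
# Brute-force 3PCF triplet binning: two triangle sides plus the angle
# between them, counted over all galaxy triplets. Illustrative only.
import itertools
import math

import numpy as np

rng = np.random.default_rng(0)
galaxies = rng.uniform(0, 100, size=(60, 3))  # toy positions in a 100-unit box

side_bins = np.linspace(0, 50, 6)        # edges for the two sides (5 bins)
angle_bins = np.linspace(0, math.pi, 7)  # edges for the opening angle (6 bins)
counts = np.zeros((5, 5, 6))

for i, j, k in itertools.combinations(range(len(galaxies)), 3):
    # Treat galaxy i as the vertex; r1 and r2 are the two measured sides.
    v1 = galaxies[j] - galaxies[i]
    v2 = galaxies[k] - galaxies[i]
    r1, r2 = np.linalg.norm(v1), np.linalg.norm(v2)
    cos_t = np.clip(np.dot(v1, v2) / (r1 * r2), -1.0, 1.0)
    b1 = np.searchsorted(side_bins, r1) - 1
    b2 = np.searchsorted(side_bins, r2) - 1
    bt = np.searchsorted(angle_bins, math.acos(cos_t)) - 1
    if 0 <= b1 < 5 and 0 <= b2 < 5 and 0 <= bt < 6:
        counts[b1, b2, bt] += 1

print(int(counts.sum()), "triplets binned")
```

The cost of the triple loop is what made the traditional 3PCF impractical for survey-sized catalogs, which is the problem Slepian's reformulation addresses.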

Anya Nugent '18

Concentration: physics and Hispanic studies double major

Hometown: Orinda, Calif.

High School: Campolindo High School

Before the code can be used for scientific purposes, the team must implement an edge correction code and a method for weighting data correctly, which is Nugent's part of the project. The edge correction code will adjust for jagged astronomical survey boundaries, which normally would negatively affect the results.

The work Nugent is doing at BLUR will culminate with a paper and presentation at the end of the summer, but her research concerning the 3PCF will not end there. "Once we've finished these codes, we can start running them on astronomical surveys and simulations so we can analyze the 3PCF and galaxy clustering. This is research I will be doing in the spring for my thesis," said Nugent.

Read more here:

Astrophysics, Galaxy Clusters and the Key to the Universe - Hamilton College News

‘Dr Yash Pal Singh simplified science for the masses’ – Hindustan Times

"Education should be based on real-life experiences, observations and happenings" is a simple thought that renowned Indian scientist Dr Yash Pal Singh left behind after his unfortunate demise on Tuesday.

A Padma Vibhushan recipient and a globally acclaimed physicist, scholar and education reformer, he died at the age of 90 on July 25 in Noida, Uttar Pradesh. The cause of death is unknown.

Across the expanse of his illustrious career, Pal made significant contributions to the field of science, particularly to the study of cosmic rays, astrophysics and high-energy physics. Prominent individuals from Pune shared their grief at his demise.

"His scientific contributions are unparalleled, of course, but what sets him apart is his ability to communicate science and reach out to the masses. Through television and different mediums, he really simplified science and made it accessible to all. Such a contribution to the field of science and education is very rare. With his death, India has lost a great soul," said Dr Prabhakar Ingle, head, science communication, CSIR-NCL, Pune.

Former chairman of UGC, Arun Nigavekar, spoke of his encounter with Pal, who held chairmanship of the UGC from 1986 to 1991. "After he finished the chairmanship, I was made a member of UGC, and since then we worked together in many committees. The most monumental work that he left behind, the effect of which would be quite lasting, is his report on renovation and rejuvenation of higher education in India, submitted to the Ministry of Human Resource Development on June 24, 2009. He fought the landmark case, and won, against fake private universities of Chhattisgarh. He did most of the groundwork for the case, compiled data and developed the arguments for the case. This was just one contribution among many, and with his demise we have lost a great human being," he said.

Further, expressing his grief and sharing his personal experience with the scholar, Nitin Karmalkar, vice chancellor of Savitribai Phule Pune University, responded, "I had an opportunity to personally travel with him, and meet him, when he came to inaugurate the electronic science department some 20-25 years ago. At that time, I was a faculty member, and was in awe to find such a humble person in an internationally acclaimed scholar like him. Despite his exemplary credentials, he would humbly mix with children of all ages to explain to them the complexities of science in the simplest of ways. He was the pioneer of sensitising, simplifying and popularising scientific education for all. Indeed, India has lost a great mind and a scientific stalwart."

Dr Pal was also on the advisory committee of Flame University, and the current vice chancellor of the varsity, Dr Devi Singh, said, "I had the fortune to know him personally and work with him. He would always bring something new, something out-of-the-box to the table, every time. He made a phenomenal contribution to higher education, and is a very respected man among all our faculties, and together we grieve this loss."

His last rites were conducted on Tuesday, at 3 pm.

Yash Pal attended the Massachusetts Institute of Technology for his PhD

In 2000, received the Indira Gandhi Prize for Popularization of Science

In 2006, received the Meghnad Saha Medal

In 2009, he received the Kalinga Prize, awarded by UNESCO for the popularisation of science

Served as the Chancellor of Jawaharlal Nehru University, New Delhi from 2007 to 2012

He was awarded the Padma Vibhushan in 2013

See the original post:

'Dr Yash Pal Singh simplified science for the masses' - Hindustan Times

Krishna Rajagopal named dean for digital learning – MIT News

Krishna Rajagopal, the William A.M. Burden Professor of Physics and former chair of the MIT faculty, has been named dean for digital learning, effective Sept. 1. This new position expands leadership roles for faculty within the Office of the Vice President for Open Learning, which recently launched the MIT Integrated Learning Initiative and the Abdul Latif Jameel World Education Laboratory.

As dean for digital learning, Rajagopal will lead efforts to empower MIT faculty to use digital technologies to augment and transform how they teach. He is charged with building and strengthening connections between academic departments and the Office of the Vice President for Open Learning, to facilitate broad-based engagement and bottom-up change. Rajagopal will catalyze, promote, and disseminate faculty innovations in MIT residential education, and he will continue to support the sharing of a broad range of MIT knowledge and perspectives with learners around the globe.

Within the Office of the Vice President for Open Learning, Residential Education, MITx, OpenCourseWare, and the Digital Learning Lab will report to Rajagopal under the leadership of Sanjay Sarma, vice president for open learning, who made the announcement today. Rajagopal will work with Sarma and Senior Associate Dean of Digital Learning Isaac Chuang on the office's strategy and organization. As a member of Academic Council, Rajagopal will provide advice and perspectives to MIT President L. Rafael Reif and the senior administration.

"Krishna combines his stellar research career with a passion for improving teaching and learning and a remarkable ability to integrate diverse points of view into a unifying vision," Sarma says. "In a time of significant changes in education, I am confident that Krishna will offer great guidance for our open learning initiatives. He will work to maintain and enhance MIT's position as a leader in providing access to high-quality education around the world, and he will continue to improve teaching at MIT."

As chair of the MIT faculty, Rajagopal distinguished himself as a strong advocate for the faculty. He was known for his listening skills, inclusive style, and ability to help colleagues and departments optimize and achieve their goals, including those involving the development and launch of new educational pathways for MIT's students.

Some of his accomplishments as former chair of the faculty include joining with Dennis Freeman, then dean of undergraduate education, to assemble a group of faculty from MIT's five schools, which conducted an in-depth study of the role of algorithmic reasoning and computational thinking in the context of the education of MIT undergraduates. He was also responsible for the charging of the Faculty Policy Committee Sub-Committee on Sub-Term Subjects and the subsequent implementation of many of its recommendations; building a new faculty governance website; and leading efforts in the creation of MIT's new Master of Applied Science (MASc) degree, an umbrella degree type introduced in fall 2016 for one-year professional master's degrees that include a capstone project.

Previously, Rajagopal served as associate head for education in the Department of Physics, where he stewarded the department's undergraduate and graduate educational programs and became known for his dedication to students. In that role, he facilitated and supported new MITx activities that improved the on-campus teaching of freshman physics and junior lab, as well as the first massive open online courses (MOOCs) on intermediate quantum mechanics and advanced quantum field theory.

"I am excited about this new challenge, as I will be helping MIT faculty members take their passions for teaching and learning to new levels in ways that can have long-lasting impact across MIT and around the world," Rajagopal says. "Our digital learning efforts already reach thousands of students in MIT classrooms and millions of learners around the world. What makes this an exciting time for education is that as these technologies, as well as research on how people learn, evolve, they are transforming how we teach today, and will do so in ways that we cannot yet see and must invent."

Since joining the MIT faculty in 1997, Rajagopal has produced a significant body of research in theoretical physics focused largely on how quarks ordinarily confined within protons and neutrons behave in extraordinary conditions such as the hot quark soup that filled the microseconds-old universe, conditions that provide a test bed for understanding how a complex world emerges from simple underlying laws. His work links nuclear and particle physics, condensed matter physics, astrophysics, and string theory.

Rajagopal is the author of about 100 papers that have been cited more than 16,000 times, and has mentored more than two dozen PhD students and postdocs. He was elected a fellow of the American Physical Society in 2004. He is a Margaret MacVicar Faculty Fellow and won the Everett Moore Baker Award for Excellence in Undergraduate Teaching in 2011 and the Buechner Prize for Excellence in Teaching in 1999.

Rajagopal grew up in suburban Toronto; his family moved there from Munich when he was less than 1 year old. Influenced by an outstanding teacher who brought pioneering advances in recombinant DNA and molecular biology into his public high school biology class, Rajagopal arrived at Queen's University in Kingston, Ontario, planning to major in biology. His freshman physics class rekindled his earlier interest in physics, and he says he much appreciates the formative educational influences that shaped his own experience.

He graduated from Queen's in 1988 and completed his PhD at Princeton University in 1993. After stints as a junior fellow at Harvard University and a Fairchild Fellow at Caltech, he joined the MIT faculty in 1997. Rajagopal has spent one year each at the University of California at Berkeley and at CERN, the physics laboratory outside Geneva, Switzerland. He lives in Arlington, Massachusetts, with his wife and two sons.

Continue reading here:

Krishna Rajagopal named dean for digital learning - MIT News

Musk vs. Zuck – The Fracas Over Artificial Intelligence. Where Do You Stand? – HuffPost

Advances in Artificial Intelligence (AI) have dominated both tech and business stories this year. Industry heavyweights such as Stephen Hawking and Bill Gates have famously voiced their concern with blindly rushing into AI without thinking about the consequences.

AI has already proven that it has the power to outsmart humans. IBM Watson famously destroyed human opponents at a game of Jeopardy, and a Google computer beat the world champion of the Chinese board game, Go.

Google's AI team is taking no chances after revealing that it is developing a 'big red button' to switch off systems if they pose a threat to humans. In fact, scientists at Google DeepMind and Oxford University have revealed their plan to prevent a doomsday scenario in their paper titled Safely Interruptible Agents.
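As a conceptual illustration only (not the algorithm from the paper), the core idea of safe interruptibility can be sketched as a learner whose updates simply exclude interrupted steps, so the 'big red button' never shows up in what the agent learns to value:

```python
import random

# Toy bandit learner. Interrupted steps are excluded from learning, so the
# interruption never becomes part of what the agent values. Conceptual
# sketch only; not the algorithm from the DeepMind/Oxford paper.
ACTIONS = ["left", "right"]
q_values = {a: 0.0 for a in ACTIONS}
ALPHA = 0.1  # learning rate

def step(action, interrupted):
    """Apply one step: reward 1.0 for 'right', else 0.0. A human interruption
    overrides the agent and produces no learning update at all."""
    if interrupted:
        return
    reward = 1.0 if action == "right" else 0.0
    q_values[action] += ALPHA * (reward - q_values[action])

random.seed(0)
for _ in range(1000):
    explore = random.random() < 0.1
    action = random.choice(ACTIONS) if explore else max(q_values, key=q_values.get)
    step(action, interrupted=(random.random() < 0.05))  # occasional red-button press

print(q_values)  # 'right' dominates; the interruptions left no trace
```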

Truth is indeed stranger than fiction and tech fans could be forgiven for nearly choking on their cornflakes this morning after hearing about a very public disagreement between the two tech billionaires. The argument is probably a good reflection of how people on both sides of the aisle feel about heading into the foggy world of AI.

In one corner, we have Mark Zuckerberg who believes AI will massively improve the human condition. Some say he is more focused on his global traffic dominance and short-term profits than the fate of humanity. Whatever your opinion, he does represent a sanguine view of futuristic technologies such as AI.

In the other corner, we have Tesla's Elon Musk, who seems to be more aware of the impact our actions might have on future generations. Musk appears concerned that once Pandora's box has been cracked open, we could unwittingly be creating a dystopian future.

Zuckerberg landed the first punch in a Facebook Live broadcast when he said

However, Elon Musk calmly retaliated by landing a virtual uppercut by tweeting "I've talked to Mark about this. His understanding of the subject is limited."

Whether you side with Musk and believe that AI will represent humanity's biggest existential threat, or think Zuckerberg is closer to the truth when he said "AI is going to make our lives better," your view is entirely subjective at this point.

However, given the range of opinions around this topic, should we be taking the future of AI more seriously than we do today?

I will tell you that big businesses with large volumes of data are falling over themselves trying to install machine learning and AI driven solutions. However, right now, many of these AI driven systems are also the source of our biggest frustrations as consumers.

Are businesses guilty of rushing into AI-based solutions without thinking of the bigger picture? There are several examples of things going awry, like chatbots claiming to be a real person, the spread of fake news, or being told you are not eligible for a mortgage because a computer says so.

There are also an increasing number of stories about AI not being quite as smart as some would believe it to be, or how often algorithms are getting it wrong or being designed to deceive consumers. For every great tech story, there is a human story about creativity and emotional intelligence that a machine can never match.

Make no mistake: the AI revolution is coming our way, and large corporations will harvest the benefits of cultivating their big data initiatives. Anything that will eliminate antiquated processes of the past and enable business efficiency can only be a giant leap forward.

However, the digital transformation of everything we know is not going to happen overnight. That does not mean we shouldn't be vigilant about how our actions today could affect future generations.

Mr. Zuckerberg may be accused by some of acting in the interests of his social media platform, and that is quite understandable. It is safe to assume that nowadays, beneath every noble statement resides a hidden interest - unless one is Mahatma Gandhi, Dr. Martin Luther King or Nelson Mandela.

On the other hand, there are also the likes of Musk and Gates that are arguably looking beyond their own business interests.

I am no expert by any stretch of the imagination, but I do ask whether we need more of us to question how advancements in technology are providing advantages for the few rather than the many.

Let's build on Elon Musk's point of view for a moment. I wonder if we should be concerned that a dystopian future awaits us on the horizon. Will the machines rise and turn on their masters?

AI is no longer merely a concept from a science fiction movie. The future is now. The reality is that businesses need to harness this new technology to secure a preemptive competitive advantage. Time-consuming, laborious and automatable tasks can be performed better and faster by machines that continuously learn, adapt and improve.

The current advances in technology have unexpected parallels with the industrial revolution that helped deliver new manufacturing processes. 200 years ago, the transition from an agricultural society to one based on the manufacture of goods and services dramatically increased the speed of progress.

Steel and iron replaced manual labor with mechanized mass production hundreds of years ago. That is not unlike the circumstances facing businesses today. The reality is that as old skills or roles slowly fade away, there will be a massive shortage of other skills and new roles relevant to the digital age.

Ultimately, we have a desire to use technology to change the world for the better in the same way that the industrial revolution changed the landscape of the world forever. The biggest problems surrounding market demand and real world needs could all be resolved by a new generation of AI hardware, software, and algorithms.

After years of collecting vast quantities of data, we are currently drowning in a sea of information. If self-learning and intelligent machines can turn this into actionable knowledge, then we are on the right path to progress. Upon closer inspection, the opportunities around climate modeling and complex disease analysis also illustrate how we should be excited rather than afraid of the possibilities.

The flip side of this is the understanding that no thing is entirely one thing. The risks versus rewards evaluation and the fact that researchers are talking about worst case scenarios should be a positive thing. I would be more concerned if the likes of Facebook, Google, Microsoft and IBM rushed in blindly without thinking about the consequences of their actions. Erring on the side of caution is a good thing, right?

Demis Hassabis is the man behind the AI research start-up DeepMind, which he co-founded in 2010 with Shane Legg and Mustafa Suleyman. DeepMind was bought by Google in 2014. Demis reassuringly told the UK's Guardian newspaper:

It would appear that all bases are being covered and we should refrain from entering panic mode.

The only question the paper does not answer is what would happen if the robots were to discover that we are trying to disable their access or shut them down. Perhaps a self-aware machine could change the programming of the infamous Red Button. But that kind of talk is confined to Hollywood movies, isn't it? Let's hope so, for the sake of the human race.

Those of us who have been exasperated by Facebook's algorithm repeatedly surfacing three-day-old posts on our timelines can tell you that much of this technology is still in its infancy.

Although we have a long way to go before AI can live up to the hype, we should nevertheless be mindful of what could happen in a couple of decades.

Despite the internet melee between the two most powerful tech CEOs of our generation over the impact of AI, I suspect that, as with most things in life, the sweet spot lies somewhere between these two contrasting opinions.

Are you nervous or optimistic about heading into a self-learning AI-centric world?


Continued here:

Musk vs. Zuck - The Fracas Over Artificial Intelligence. Where Do You Stand? - HuffPost

Artificial intelligence is not as smart as you (or Elon Musk) think … – TechCrunch

In March 2016, DeepMind's AlphaGo beat Lee Sedol, who at the time was the best human Go player in the world. It was one of those defining technological moments, like IBM's Deep Blue beating chess champion Garry Kasparov, or IBM Watson beating the world's greatest Jeopardy! champions in 2011.

Yet these victories, as mind-blowing as they seemed, were more about training algorithms and brute-force computational strength than any real intelligence. Former MIT robotics professor Rodney Brooks, one of the founders of iRobot and later Rethink Robotics, reminded us at the TechCrunch Robotics Session at MIT last week that training an algorithm to play a difficult strategy game isn't intelligence, at least as we think about it in humans.

He explained that as strong as AlphaGo was at its given task, it couldn't do anything else but play Go on a standard 19 x 19 board. He relayed a story: while speaking to the DeepMind team in London recently, he asked what would have happened if they had changed the size of the board to 29 x 29, and the AlphaGo team admitted that with even a slight change to the size of the board, "we would have been dead."

"I think people see how well [an algorithm] performs at one task and they think it can do all the things around that, and it can't," Brooks explained.

As Kasparov pointed out in an interview with Devin Coldewey at TechCrunch Disrupt in May, it's one thing to design a computer to play chess at grandmaster level, but it's another to call that intelligence in the pure sense. It is simply throwing computer power at a problem and letting a machine do what it does best.

"In chess, machines dominate the game because of the brute force of calculation and they [could] crunch chess once the databases got big enough and hardware got fast enough and algorithms got smart enough, but there are still many things that humans understand. Machines don't have understanding. They don't recognize strategical patterns. Machines don't have purpose," Kasparov explained.
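Kasparov's distinction is easy to make concrete. The toy sketch below uses single-pile Nim as a stand-in for chess (a real engine would run to thousands of lines); it plays perfectly through nothing but exhaustive enumeration, and every name in it is illustrative rather than drawn from any actual engine:

```python
# A toy illustration of "brute force of calculation": perfect play at
# single-pile Nim (take 1-3 stones; taking the last stone wins) found by
# exhaustively scoring every future position. Nothing here "understands"
# the game.

def minimax(pile: int, maximizing: bool) -> int:
    """Game value from the maximizing player's perspective: +1 win, -1 loss."""
    if pile == 0:
        # The previous player took the last stone and won.
        return -1 if maximizing else 1
    scores = [minimax(pile - take, not maximizing)
              for take in range(1, min(3, pile) + 1)]
    return max(scores) if maximizing else min(scores)

def best_move(pile: int) -> int:
    """Choose the move with the best exhaustive-search score."""
    return max(range(1, min(3, pile) + 1),
               key=lambda take: minimax(pile - take, False))

print(best_move(10))  # prints 2: leave a multiple of 4 and victory is forced
```

The program never understands Nim; it simply scores every reachable future, which is exactly the brute force Kasparov describes.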

Gil Pratt, CEO at the Toyota Institute, a group inside Toyota working on artificial intelligence projects including household robots and autonomous cars, was interviewed at the TechCrunch Robotics Session. He said the fear we are hearing from a wide range of people, including Elon Musk, who most recently called AI an existential threat to humanity, could stem from science-fiction dystopian depictions of artificial intelligence run amok.

"The deep learning systems we have, which is what sort of spurred all this stuff, are remarkable in how well we do given the particular tasks that we give them, but they are actually quite narrow and brittle in their scope. So I think it's important to keep in context how good these systems are, and actually how bad they are too, and how long we have to go until these systems actually pose that kind of a threat [that Elon Musk and others talk about]."

Brooks said in his TechCrunch Sessions: Robotics talk that there is a tendency to assume that if an algorithm can do x, it must be as smart as humans. "Here's the reason that people, including Elon, make this mistake. When we see a person performing a task very well, we understand the competence [involved]. And I think they apply the same model to machine learning," he said.

Facebook's Mark Zuckerberg also criticized Musk's comments, calling them "pretty irresponsible" in a Facebook Live broadcast on Sunday. Zuckerberg believes AI will ultimately improve our lives. Musk shot back later that Zuckerberg had a limited understanding of AI. (And on and on it goes.)

It's worth noting, however, that Musk isn't alone in this thinking. Physicist Stephen Hawking and philosopher Nick Bostrom have also expressed reservations about the potential impact of AI on humankind, but chances are they are talking about the more generalized artificial intelligence being studied in labs at the likes of Facebook AI Research, DeepMind and Maluuba, rather than the narrower AI we are seeing today.

Brooks pointed out that many of these detractors don't actually work in AI, and suggested they don't understand just how difficult it is to solve each problem. "There are quite a few people out there who say that AI is an existential threat: Stephen Hawking, [Martin Rees], the Astronomer Royal of Great Britain, a few other people. And they share a common thread in that they don't work in AI themselves," Brooks said. He went on to say, "For those of us who do work in AI, we understand how hard it is to get anything to actually work through product level."

Part of the problem stems from the fact that we call it artificial intelligence. It is not really like human intelligence at all, which Merriam-Webster defines as "the ability to learn or understand or to deal with new or trying situations."

Pascal Kaufmann, founder of Starmind, a startup that wants to help companies use collective human intelligence to find solutions to business problems, has been studying neuroscience for the past 15 years. He says the human brain and the computer operate differently, and it's a mistake to compare the two. "The analogy that the brain is like a computer is a dangerous one, and blocks the progress of AI," he says.

Further, Kaufmann believes we won't advance our understanding of human intelligence if we think of it in technological terms. "It is a misconception that [algorithms] work like a human brain. People fall in love with algorithms and think that you can describe the brain with algorithms, and I think that's wrong," he said.

When things go wrong

There are in fact many cases of AI algorithms not being quite as smart as we might think. One infamous example of AI run amok was Tay, a chatbot created by the Microsoft AI team last year. It took less than a day for the bot to learn to be racist. Experts say this could happen to any AI system when bad examples are presented to it. In the case of Tay, it was fed racist and otherwise offensive language, and since it had been taught to learn from and mirror that behavior, it soon ran out of the researchers' control.
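The Tay failure mode is straightforward to reproduce in miniature. The following toy sketch is emphatically not Microsoft's actual architecture; it just counts what users say and parrots the most frequent utterance back, so toxic input becomes toxic output:

```python
# A toy "learn and mirror" bot. It has no notion of what words mean;
# it only counts utterances and echoes the most common one.

from collections import Counter

class MirrorBot:
    def __init__(self) -> None:
        self.heard = Counter()

    def learn(self, utterance: str) -> None:
        self.heard[utterance.lower()] += 1  # naive "learning": just count

    def reply(self) -> str:
        # Naive generation: parrot the most frequent thing users said.
        return self.heard.most_common(1)[0][0] if self.heard else "hello!"

bot = MirrorBot()
for message in ["nice weather", "robots rule", "robots rule"]:
    bot.learn(message)
print(bot.reply())  # "robots rule": the bot simply reflects its inputs
```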

A widely reported study by researchers at Cornell University and the University of Wyoming found that it was fairly easy to fool algorithms trained to identify pictures. The researchers found that when presented with what looked like scrambled nonsense to humans, the algorithms would confidently identify it as an everyday object, such as a school bus.

What's not well understood, according to an MIT Tech Review article on the same research project, is why the algorithms can be fooled in the way the researchers found. What we do know is that humans have learned to recognize whether something is a picture or nonsense, while algorithms analyzing pixels can apparently be manipulated.
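The manipulation can be illustrated with a deliberately simplified model. The sketch below uses a toy linear classifier rather than the deep networks from the actual study; the point is only that a tiny, structured nudge to every pixel can force a confident label while leaving the image visually almost unchanged:

```python
# Toy pixel-level manipulation: nudge each pixel a small amount in the
# direction of the model's weights and the label flips to "school bus",
# even though no pixel changes by more than epsilon.

import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(size=784)           # a made-up linear "classifier"
image = rng.normal(size=784) * 0.1       # an arbitrary input "image"

def predict(x: np.ndarray) -> str:
    return "school bus" if weights @ x > 0 else "nonsense"

print("before:", predict(image))
epsilon = 0.05                           # maximum change per pixel
adversarial = image + epsilon * np.sign(weights)
print("after: ", predict(adversarial))   # reliably "school bus"
print("largest pixel change:", np.abs(adversarial - image).max())  # 0.05
```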

Self-driving cars are even more complicated, because there are things humans understand when approaching certain situations that would be difficult to teach a machine. In a long blog post on autonomous cars published in January, Rodney Brooks raises a number of such situations, including how an autonomous car might approach a stop sign at a crosswalk in a city neighborhood where an adult and child stand at the corner chatting.

The algorithm would probably be tuned to wait for the pedestrians to cross, but what if they had no intention of crossing because they were waiting for a school bus? "A human driver could signal to the pedestrians to go, and they in turn could wave the car on, but a driverless car could potentially be stuck there endlessly waiting for the pair to cross because they have no understanding of these uniquely human signals," he wrote.
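Brooks's stuck-car scenario boils down to a rule that can never be satisfied. A hypothetical sketch (these rules are invented for illustration and resemble no real autonomous-driving stack) makes the deadlock explicit:

```python
# The policy only sees presence at the crosswalk, never intent, so
# pedestrians who are merely waiting for a school bus stall it forever.

def should_proceed(pedestrians_at_corner: bool, waved_on: bool) -> bool:
    # Rule: go only if waved on, or if nobody is standing at the corner.
    return waved_on or not pedestrians_at_corner

ticks = 0
while not should_proceed(pedestrians_at_corner=True, waved_on=False):
    ticks += 1
    if ticks >= 5:   # cut the demo short; the real loop never exits
        print(f"still waiting after {ticks} ticks: deadlocked")
        break
```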

Each of these examples shows just how far we have to go with artificial intelligence algorithms. Should researchers become more successful at developing generalized AI, this could change, but for now there are things humans can do easily that are much harder to teach an algorithm, precisely because our learning is not limited to a set of defined tasks.

Read this article:

Artificial intelligence is not as smart as you (or Elon Musk) think ... - TechCrunch

Roadwork gets techie: Drones, artificial intelligence creep into the road construction industry – The Mercury News

High above the Balfour interchange on State Route 4 in Brentwood, a drone buzzes, its sensors keeping a close watch on the volumes of earth being moved to make way for a new highway bypass. In Pittsburg, a camera perched on the dash of a car driving through city streets periodically snaps pictures of potholes and cracks in the pavement. And at the corner of Harbor and School streets in the same city, another camera monitors pedestrians, cyclists and cars where 13-year-old Jordyn Molton lost her life late last year after a truck struck her.

Although the types of technology and their goals differ, all three first-of-their-kind projects in Contra Costa County aim to improve the road construction and maintenance industry, which has lagged significantly behind other sectors in adopting new technology. A lack of investment has stifled innovation, said John Bly, vice president of the Northern California Engineering Contractors Association.

But with the recent passage of SB1, a gas-tax and transportation infrastructure funding bill, that's all set to change, he said.

"You may see some of these high-tech firms find new market niches, because now you have billions of dollars going into transportation infrastructure and upgrades," he said. "That's coming real quick."

It's still so new that Bly was hard-pressed to think of other places in the state where drones and artificial intelligence software are being integrated into road construction work. The pilot programs in the East Bay are cutting-edge, he said.

At the Contra Costa Transportation Authority, Executive Director Randy Iwasaki has been pushing for several years to experiment with emerging technology in the road construction and maintenance industry. So when the authority's construction manager, Ivan Ramirez, came to him with an idea to use drones in its $74 million interchange project, Iwasaki was eager to try it.

"We often complain we don't have enough money for transportation," Iwasaki said, adding that the use of drones at the interchange project in Brentwood would enable the authority's contractors to save paper, time and money.

That's because, traditionally, survey crews standing on the edge of the freeway would take measurements of the dirt each time it is moved. The process is time-consuming and hazardous, Ramirez said. But it is only the tip of the iceberg when it comes to potential applications for the drone technology, which could also be used to inspect poles or bridges and to perform tasks people haven't yet thought of.

"As you begin to talk to people, then other ideas begin to emerge about where we might be going, and it's propelling more ideas for the future," Ramirez said. "By not having surveyors on the road, or not having to send an inspector up in a manlift way up high or into a confined space, not only is it more efficient, but it will provide safety improvements as well."
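The surveying task the drones replace is, at its core, a cut-and-fill volume calculation. Here is a minimal sketch, assuming two gridded elevation models from successive flights; the grid, cell size and elevations are invented, and this is not Alta Vista's actual pipeline:

```python
# Cut-and-fill from two elevation grids: difference the cells, then
# multiply by cell area. Positive cells are fill, negative cells are cut.

import numpy as np

cell_area_m2 = 0.25  # assume 0.5 m x 0.5 m grid cells
before = np.array([[10.0, 10.2],
                   [10.1, 10.3]])       # elevations in meters, flight 1
after = np.array([[10.5, 10.2],
                  [9.8, 10.3]])         # elevations in meters, flight 2

diff = after - before                   # per-cell elevation change
fill_m3 = diff[diff > 0].sum() * cell_area_m2   # earth added
cut_m3 = -diff[diff < 0].sum() * cell_area_m2   # earth removed
print(f"fill: {fill_m3:.3f} m^3, cut: {cut_m3:.3f} m^3")
```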

Meanwhile, in Pittsburg, the city is working with RoadBotics on a pilot program to better manage its local roads. The company uses car-mounted cellphone cameras to snap photos of street conditions before running that data through artificial intelligence software to create color-coded maps showing which roads are in good shape, which need monitoring and which are in need of immediate repairs.

The company's goal is to make it easier for city officials to monitor and manage their roads so that small repairs don't turn into complete overhauls, said Mark DeSantis, the company's CEO. Representatives from Pittsburg did not respond to requests for comment.

"The challenge of managing roads is not so much filling the little cracks; that's not much of a burden," DeSantis said. "The real challenge is when you have to repave the road completely. So the idea is to see the features on the road and see which ones are predictive of roads that are about to fail."
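The color-coded map DeSantis describes can be approximated with a simple thresholding rule. The sketch below is illustrative only; the scores, cutoffs and street names are invented, and RoadBotics' real model is certainly more sophisticated:

```python
# Map a per-segment pavement distress score to a maintenance category,
# mirroring the green/yellow/red map described in the article.

def categorize(distress_score: float) -> str:
    """Map a 0-1 pavement distress score to a map color."""
    if distress_score < 0.2:
        return "green (good shape)"
    if distress_score < 0.5:
        return "yellow (monitor)"
    return "red (repair now)"

segments = {
    "Harbor St, blocks 100-200": 0.12,
    "School St, blocks 300-400": 0.41,
    "Railroad Ave, blocks 0-100": 0.78,
}
for name, score in segments.items():
    print(f"{name}: {categorize(score)}")
```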

At the same time, Charles Chung of Brisk Synergies is hoping to use cameras and artificial intelligence software in a different way: to see how the design of a road influences how drivers behave. At the corner of Harbor and School streets, the company installed a camera to watch how cars, cyclists and pedestrians move through the intersection and to identify why drivers might be speeding. In particular, the company is trying to determine how effective crossing guards are at slowing down cars, he said.

The company is still gathering data on that intersection and writing its report, but Chung said it was able to use the software in Toronto to document a 30 percent reduction in vehicle crashes after the city made changes to an intersection there. Before, documenting the need for changes would require special crews to either monitor the roads directly or watch footage from a video feed, both of which take time and personnel.

While these types of technology are only now emerging in a handful of local projects, they will soon become far more prevalent, said Bart Ney of Alta Vista Solutions, the construction-management firm using drones on the SR 4 project.

"We're at the beginning of the wave," he said. "Like any disruptive technology, there is a period when you have to embrace it and take it into the field and test it so it can achieve what it's capable of. We're on the brink of that happening."

Originally posted here:

Roadwork gets techie: Drones, artificial intelligence creep into the road construction industry - The Mercury News

AI2 lists top artificial intelligence systems in its Visual Understanding Challenge – GeekWire

For AI2's Charades Challenge, visual systems had to recognize and classify a wide variety of daily activities in realistic videos. This is just a sampling of the videos. (AI2 Photos)

Some of the world's top researchers in AI have proved their mettle by taking top honors in three challenges posed by the Seattle-based Allen Institute for Artificial Intelligence.

The institute, also known as AI2, was created by Microsoft co-founder Paul Allen in 2014 to blaze new trails in the field of artificial intelligence. One of AI2's previous challenges tested the ability of AI platforms to answer eighth-grade-level science questions.

The three latest challenges focused on visual understanding, that is, the ability of a computer program to navigate real-world environments and situations using synthetic vision and machine learning.

These aren't merely academic exercises: visual understanding is a must-have for AI applications ranging from self-driving cars to automated security monitoring to sociable robots.

More than a dozen teams signed up for the competitions, and the algorithms were judged based on their accuracy. Here are the three challenges and the results:

Charades Activity Challenge: Computer vision algorithms looked at videos of people performing everyday activities, for example, drinking coffee, putting on shoes while sitting in a chair, or snuggling with a blanket on a couch while watching something on a laptop. One of the algorithms' objectives was to classify all the activity categories for a given video, even if two activities were happening at the same time. Another objective was to identify the time frames for all activities in a video. (A minimal sketch of this multi-label setup follows the results below.)

Team Kinetics from Google DeepMind won the challenge on both counts. In a statement, AI2 said the challenge significantly raised state-of-the-art accuracy for human activity recognition.
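As promised above, here is a minimal sketch of the multi-label output the Charades task calls for. The labels and confidences are invented, and this is not AI2's scoring code: a model emits an independent confidence per activity category, and every category above a threshold counts as present, so one video can legitimately carry several labels at once.

```python
# Multi-label activity recognition in miniature: independent per-category
# confidences, thresholded, so simultaneous activities all get reported.

scores = {
    "drinking coffee": 0.91,
    "putting on shoes": 0.08,
    "watching a laptop": 0.77,
    "snuggling with a blanket": 0.64,
}
threshold = 0.5
detected = [label for label, p in scores.items() if p >= threshold]
print(detected)  # three simultaneous activities detected in one video
```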

THOR Challenge: The teams' computer vision systems had to navigate through 30 nearly photorealistic virtual scenes of living rooms and kitchens to find a specified target object, such as a fork or an apple, based solely on visual input.

THOR's top finisher was a team from National Tsing Hua University in Taiwan.

Textbook Question Answering Challenge: Computer algorithms were given a data set of textual and graphic information from a middle-school science curriculum, and then were asked to answer more than 26,000 questions about the content.

AI2 said the competition was exceptionally close, but the algorithm created by Monica Haurilet and Ziad Al-Halah from Germany's Karlsruhe Institute of Technology came out on top for text questions. Yi Tay and Anthony Luu from Nanyang Technological University in Singapore won the diagram-question challenge.

"The challenge participants significantly improved state-of-the-art performance on TQA's text questions, while at the same time confirming the difficulty machine learning methods have answering questions posed with a diagram," AI2 said.

The top test scores are pretty good for an AI, but they'd be failing grades for a flesh-and-blood middle-schooler: 42 percent accuracy on the text-question exam, and 32 percent on the diagram-question test.
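For readers curious how such per-type scores are tallied, here is a minimal sketch (illustrative only; AI2's actual evaluation harness may differ) that compares predictions to an answer key and reports accuracy separately for text and diagram questions:

```python
# Group predictions by question type and compute accuracy for each.

from collections import defaultdict

def accuracy_by_type(predictions: dict, answer_key: dict) -> dict:
    correct, total = defaultdict(int), defaultdict(int)
    for qid, guess in predictions.items():
        qtype, answer = answer_key[qid]
        total[qtype] += 1
        correct[qtype] += int(guess == answer)
    return {qtype: correct[qtype] / total[qtype] for qtype in total}

answer_key = {"q1": ("text", "b"), "q2": ("text", "c"),
              "q3": ("diagram", "a"), "q4": ("diagram", "d")}
predictions = {"q1": "b", "q2": "a", "q3": "a", "q4": "b"}
print(accuracy_by_type(predictions, answer_key))
# {'text': 0.5, 'diagram': 0.5}
```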

Representatives from the winning teams will join other AI researchers at a workshop planned for Wednesday during the 2017 Conference on Computer Vision and Pattern Recognition in Honolulu.

Read this article:

AI2 lists top artificial intelligence systems in its Visual Understanding Challenge - GeekWire