Will Amazon’s Deal With Whole Foods Boost Its Media & Retail Ecosystem? – Deadline

Amazon isn't just a master at selling stuff. It's also a shrewd buyer, as its $13.7 billion agreement this morning to acquire Whole Foods demonstrates.

Usually, in a big deal like this one, investors respond to the news by driving down the acquirer's shares. But Amazon ended the day up 3%.

Buyers are optimistic that CEO Jeff Bezos will boost Amazon's revenues and reach by establishing a foothold in the nearly $1 trillion a year U.S. grocery business. That could be reason enough to justify the acquisition.

But the deal also raises intriguing possibilities for Amazon to expand scale, and deepen customer loyalty, for its entire sales ecosystem including its media offerings led by the Amazon Prime streaming video service.

The company should find it easy to dream up deals that motivate Whole Foods customers to sign up for Amazon Prime, the $99-a-year service that includes streaming video and music, as well as two-day delivery of most products Amazon sells.

The e-retailer might also tempt Prime members to pick up an Amazon Echo. The voice-driven device answers questions, executes orders to fulfill digital requests, including queuing up videos and playlists, and of course makes it easy for users to buy products from Amazon.

That could now include Whole Foods shopping lists. (Would it confuse Echo if Bezos were playful enough to offer meat-eaters a cut called Amazon Prime?)

And, of course, the additional data about customers' grocery likes would strengthen Amazon's ability to charge high prices for targeted ads. Buyers could know, from people's shopping histories, who might be a promising prospect to buy, say, a particular kind of cereal or dog food, and craft sales pitches based on that intimate knowledge.

What may be more important for Bezos, though, are the delivery options open to him with 450 Whole Foods stores spread across 48 states. They will give Amazon outlets across the nation's upscale neighborhoods.

Assuming the company doesn't limit Whole Foods stores and the distribution network that serves them to groceries, then they can help address a nagging problem: speed. Bezos has long dreamed about adding same-day delivery to a system that consumers find easy to use.

Analysts are betting that Amazon will make the deal pay off.

BMO Capital Markets' Daniel Salmon says he's highly confident that Amazon will leverage the new store footprint for much more than just selling groceries.

Cowen & Cos John Blackledge says that grocery is Amazons biggest potential source of revenue upside over time and represents the continued evolution of Amazons multi-platform approach.

RBC Capital Markets' Mark Mahaney warns that Amazon faces substantial execution risk in the deal. But he isn't worried: "For a company of Amazon's size this is actually a relatively modest acquisition. As a result, it's not an investment thesis changer for Amazon, although the competitive implications for other grocers could be enormous."


Interest in creating entrepreneur ecosystem – Wahpeton Daily News

The Breckenridge Port Authority learned about entrepreneurial ecosystems during their meeting Wednesday.

Justin Neppl, with the Small Business Development Center, Wahpeton, shared information about creating a business incubator in the region.

The board originally wanted to hire a part-time economic development consultant to bring new businesses to the city, and had identified Neppl as the logical choice.

Neppl said he would prefer to help in another way: lead the development of an entrepreneurial network for the area.

"Traditionally, economic development has been, you hire somebody and expect them to pick up the phone and bring businesses in, retain business and grow local businesses," he said. "That expectation is very unrealistic. Having one person managing all that is a failure, plus ultimately entrepreneurs don't really care about economic developers, they care about other entrepreneurs."

He said that was the underlying theme of the book he brought along, which was recommended by several people during his research, called Startup Communities: Building an Entrepreneurial Ecosystem in Your City, by Brad Feld.

"It's what Fargo has done: put up programs that attract other entrepreneurs. Economic development is still there and a necessary government position, but not in the same way most think it should be," Neppl said.

He proposed that, instead of paying him, the board set a budget to create programs to connect entrepreneurs in the Twin Towns and surrounding areas.

"I would go forth and have those same conversations with the Wahpeton EDC and CDC, (asking) that they would match the budget that's set," he explained.

He wouldn't try to duplicate existing efforts, such as Fargo's One Million Cups, which meets weekly.

"It's a fantastic place that a lot of entrepreneurs get to meet others, and hear pitches on business ideas and get feedback from the community. They also ran Startup Drinks for a long time, which was an after-hours thing."

He envisions the group meeting regularly to discuss a particular business topic or see a presentation and allow networking.

Working with the SBDC for the past three years has shown him how businesses get going, he said.

"When a business wants to come to town or someone wants to start a business, they're going to local entrepreneurs. They're getting together and solving problems," he said. "If you had this group established, they would go to the entrepreneurs, now they have a mentor, and they would then be kicked back to the economic development office who can pair them with the right programs or answer questions they may have from a government standpoint."

Asked if the area would be at a disadvantage, competing with larger communities like Fargo-Moorhead or Fergus Falls, Neppl said people and businesses don't care about boundaries, which are created by government.

"That's one of the things the book states and I agree 100 percent," he said. "The more we can collaborate, the better off we are. We should be contacting Fergus (Falls) all the time and networking with their entrepreneurs over there and seeing how we can improve."

He also pointed to Battle Lake as an example of a community with a tight-knit group of business owners.

"It's something we lack here," he said.

Board member Dennis Larson, also a Wilkin County commissioner, said the county board is interested in working with the city regarding economic development in some form, but was unsure how to explain the idea to them. He invited Neppl to a commissioners meeting for a presentation.

"You have to get entrepreneurs excited first, and everyone else will come along. That's where the money will be, so all of a sudden bankers will be interested, insurance agents, investors, and government. Once you excite the entrepreneurs, you're good to go," Neppl said. "The Jay Schulers of the world would love something like this here. We need a platform that they'll attend and find value in and network."

Statistics show an average of 10 percent of a community's population are entrepreneurs, Neppl said. That would mean about 300 people in Breckenridge are potentially entrepreneurs or are interested in starting a business, and about 1,100 people between the two towns.

"Maybe our entire entrepreneurship percentage isn't as high as Fargo's, but we could attract them. There's an ability to add," he said. "When you start sharing an idea with an entrepreneur, that's always a good thing. They can start networking for you to get the idea executed into a business."


Are Open APIs the Stairway to the New Payments Ecosystem? – Finextra (blog)

Losing yourself in music is something that everyone should do at least once a day. The freedoms that it affords one's mind cannot be overstated. In my eclectic (and eternal) playlist, there is one song that has likely been played at least twice as much as any other (and to be fair, it likely has the same global play counts): "Stairway to Heaven" by Led Zeppelin, an epic song that helped define the band to generations. But it was this past week, while listening to the song with my mind turning over thoughts on banking, that a line jumped out at me with new meaning: "There's a feeling I get, when I look to the west..." While Jimmy and Robert were musing over much larger questions about life and death, my current journey had me subconsciously thinking about banking.

We have all talked, and many with animated excitement, about where we are in our industry and the state of disruption and transformation of what ACI terms the New Payments Ecosystem. Many of us have agreed that change is upon us, and almost as many agree that no one knows for sure what that new landscape will look like. The immediate impact of Open API technology on banking is evident in the use and business cases that seem to emerge on a daily basis. This is partly because banks are forced to be more open by regulation such as PSD2, but also partly due to a need to be more open in order to form partnerships that deliver the new services our customers expect. We see the effects of these wider trends in the application of (not yet fully open) API technology to open up the payments schemes, and the move away from proprietary protocols and networks. It's not just the more established tech we have to get our heads around. Often over-hyped, but likely to have lasting impact, are distributed ledger technology and blockchain, which will both thrive in this new open market. New technologies will drive drastic change in the industry. To quote a later line from the now firmly planted earworm: "And a new day will dawn, For those who stand long, And the forests will echo with laughter."

Financial institutions, retailers, technology providers, new entrants and incumbents have all arrived at this interesting crossroads in their respective journeys. The New Payments Ecosystem that is emerging, and the opportunities it affords, look, feel and are different from the existing world. The roles that each participant played and fine-tuned over a generation are suddenly being challenged, augmented, adjusted or replaced, which is both scary and exciting. This disruption has everyone in the ecosystem looking closely at their approach to the market, realigning to the new opportunities that this ecosystem presents, and adjusting their business models to meet the new demands of this space.

Jimmy gives us all hope, though: "Yes, there are two paths you can go by, But in the long run, There's still time to change the road you're on." We are all on the early part of this journey, and we do have choices on the routes we are going to take; part of that is embracing the start-up mentality of agile development and fast-fail models. It's time for all of us to roll up our sleeves and start playing with new ideas and new technologies, and embrace the journey. And who knows, maybe it will lead us to the magical staircase to the promised land of banking that Led Zeppelin were clearly singing about!

If you want to talk '70s classic rock, or indeed Open APIs and digital banking, catch me at EBA Day 2017 on Stand 30.


Persistent Memory Programming: The Current State of the Ecosystem – insideHPC

In this video from the MSST 2017 Mass Storage Conference, Andy Rudoff from Intel presents: Persistent Memory Programming: The Current State of the Ecosystem.

In this presentation, Andy will report on the latest developments around persistent memory programming. He'll describe current discussions in the SNIA NVM Programming Technical Work Group, the current state of operating system support, and recent tool and library development, and finally he'll describe some of the upcoming challenges for high-performance persistent memory use.
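One way to picture the programming model under discussion: persistent memory is exposed as directly mapped, byte-addressable storage, so an application maps a file backed by it into its address space, modifies it with ordinary loads and stores, and then explicitly flushes the modified range to make the data durable. The sketch below is only an illustration of that flow, using a plain memory-mapped file and an msync-style flush as a stand-in for the CPU cache-flush primitives that dedicated libraries such as libpmem provide; the file path and sizes are made up.

    import mmap
    import os

    # Minimal sketch of the persistent-memory programming model: map a file,
    # store into the mapping with ordinary memory operations, then flush the
    # modified range so it reaches durable media. Real persistent-memory
    # libraries replace the msync-style flush below with cache-flush
    # instructions; this is an illustration, not an actual pmem API.
    PATH = "/mnt/pmem/example.dat"   # hypothetical file on a DAX-mounted filesystem
    SIZE = 4096

    fd = os.open(PATH, os.O_CREAT | os.O_RDWR, 0o600)
    os.ftruncate(fd, SIZE)

    pm = mmap.mmap(fd, SIZE)         # byte-addressable view of the file
    pm[0:13] = b"hello, pmem!\n"     # ordinary stores into the mapping
    pm.flush(0, SIZE)                # make the stores durable
    pm.close()
    os.close(fd)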

Andy Rudoff is a Senior Principal Engineer at Intel Corporation, focusing on Non-Volatile Memory programming. He is a contributor to the SNIA NVM Programming Technical Work Group. His more than 30 years of industry experience includes design and development work in operating systems, file systems, networking, and fault management at companies large and small, including Sun Microsystems and VMware. Andy has taught various Operating Systems classes over the years and is a co-author of the popular UNIX Network Programming textbook.

See more talks in the MSST 2017 Video Gallery

Check out our insideHPC Events Calendar


Report: Cris Cyborg vs. Megan Anderson in the works for 145-pound title fight at UFC 214 – MMAmania.com

UFC 214 is likely getting a women's featherweight title fight between Brazilian knockout artist Cris Cyborg and current Invicta FC champion Megan Anderson, per a recent report by MMA Fighting's Ariel Helwani.

Cris Cyborg vs. Megan Anderson for the women's 145 title at UFC 214 is very close to being finalized, I'm told. Almost there. Looking good.

Remember, UFC just crowned the first-ever women's 145-pound champion earlier this year when Germaine de Randamie defeated Holly Holm at UFC 208 to claim the title. But after GDR made it very clear that she has no plans of fighting Cyborg, the promotion has decided to go a different route.

Cyborg, 31, is widely considered the best fighter in women's MMA today. The former Invicta FC champion has been fed lackluster competition since making her UFC debut last year and deserves a chance to hoist promotional gold.

Anderson, 27, was able to take over Invicta FC's 145-pound division after Cyborg left for UFC. The Australia native is a very big featherweight (four inches taller than Cyborg) and would certainly serve as a formidable opponent for the Brazilian.

If this matchup does in fact get booked, de Randamie will most likely be stripped of her title, making her UFC 208 clash with Holm one of the more ridiculous title fights of all time.

UFC 214 will take place on July 29 live on pay-per-view (PPV) from inside Honda Center in Anaheim, California, and feature a main event clash between current UFC light heavyweight champion Daniel Cormier and former divisional kingpin Jon Jones.



Sci-Fan Block: Cyborg, Ghost in the Shell and More! Review – The Nerd Stash (press release)

Sci-Fan Block is one of the many variations of the Nerd Block, which brings you a mixture of sci-fi and fantasy in one box!

Cost: Starts at $19.99/month

What comes in the box?: You'll receive 4-6 fun collectibles &amp; one exclusive T-shirt.

We are going to show you the contents of the March 2017 box; however, you've got 29 days to get in on the July Box (Preview Below).

The March 2017 box brought some pretty nice items. Let's take a look!

The first item was this Ghost in the Shell T-shirt. I actually really like the graphic on this shirt!

The next item was this Cyborg Martini Shaker. I'm not a big martini drinker myself, but I think we could come up with a new drink.

Here is a Cyborg Bar Towel to go with your Martini Shaker. You may need it after you shake up your new concoction.

I wasn't truly impressed with the quality of this item and would probably have no use for it myself. It's a Judge Dredd vinyl holder. I guess you could put money or an ID in it. It just seems more like one of those dollar-store toys that your kid thinks they must have, and then it breaks or falls apart two days later. I like the idea, but the quality needs improvement.

This Wild West VS The Future art print from Westworld is pretty cool, with a clear foil stamping on it. It's hard to see in the picture, but it has a pretty cool 3D-like effect.

Being married to a Trekkie, I immediately knew what this pin was and that it was from Star Trek. He loves this pin and will be adding it to his Star Trek collection!

Of course, you get the Sci-Fan edition of the Nerd Block Magazine in each box with articles related to the content as well as other Sci-Fan related information.

Sci-Fan Block by Nerd Block is a great choice for the person who is into sci-fi and fantasy. It's a great mixture of both and brings an overall quality box. You have a little less than a month to head over and sign up for the July box.

Website: nerdblock.com/scifan

Facebook: facebook.com/NerdBlock

Twitter: @NerdBlock

I am the wife of a real nerd who has developed some nerd-like tendencies along the way. I am a mom of two, with dreams of becoming a graphic designer and writer while working at the local community college full-time. I earned a B.S. in Communication from the University of Louisville and plan to eventually get my Master's Degree... They say it's never too late to start!


Coach Winklejohn Says Holly Holm Would Fight Cyborg – Fightful (press release) (registration)


Coach Mike Winklejohn has helped lead Holly Holm to lots of success in boxing and MMA, including helping Holm win the UFC Women's Bantamweight Title at UFC 193. When it comes to the subject of Cristiane Cyborg Justino, the famed MMA coach says ...


Water quality good at most Massachusetts beaches; Issues remain at some urban spots – Wicked Local Medfield

THE ISSUE: Although water quality is generally good at Massachusetts beaches, issues remain in some areas.

THE IMPACT: An average of 4.9 percent of samples from marine beaches and 3.8 percent of samples from freshwater beaches test positive for elevated bacteria levels.

Rain can put a damper on summer fun in a variety of ways.

Not only does heavy rain keep people indoors, but it also can overflow sewer systems and carry garbage to the coast, sometimes causing a temporary spike in unsafe bacteria levels at beaches.

"There's filthy, bacteria-laden storm water, which typically gets to the beach after running into storm drains in the road," said Bruce Berman, a spokesman for Save the Harbor/Save the Bay. "When you think about rain, it washes everything in the streets into storm drains."

Water quality in Massachusetts beaches is generally good, Berman said, but some issues remain, particularly around urban beaches.

The vast majority of the time, issues are minimal.

Last summer, state and local agencies collected a total of 15,604 water samples from 586 marine beach sites and 594 freshwater beach sites. About 3.5 percent of samples from ocean beaches and 3 percent of freshwater samples tested positive for elevated bacteria levels, compared to historic averages of 4.9 percent and 3.8 percent respectively. Last year's drought, according to public health officials, was likely a factor in lower bacteria levels.

[Interactive map: beaches where bacteria tested high enough to close swimming during the 2016 summer season, with markers color-coded by days closed: one day (blue), two days (yellow), three days (orange), four or five days (pink), and 10 or more days (red). Wicked Local Graphic/Caitlyn Kelleher]

"Overall, Massachusetts beaches have excellent water quality," said Dr. Marc A. Nascarella, chief toxicologist and director of the Department of Public Health's Environmental Toxicology Program.

A challenge for beaches, particularly those in urban areas, is old sewer infrastructure, which can cause underground sewer pipes to leak into stormwater pipes when there's heavy rain.

"Rainfall is the most significant driver of bacteria exceedances in Massachusetts," Nascarella said.

Last summer, there were 160 "no swimming" postings at marine beaches, with beaches in Boston, Lynn and Quincy being closed the most often. Most closures were due to high bacteria levels, but rip currents, shark sightings and other factors also caused some postings.

At inland, freshwater beaches there were 114 postings in 2016, with beaches in Brimfield, Templeton and West Tisbury reporting the highest number of high-bacteria samples. In addition to bacteria, algae blooms, often caused by fertilizer runoff, caused closures at freshwater beaches.

Overall, Massachusetts has 529 public marine and 549 freshwater public beaches.

"Human fecal matter can enter beach water in a variety of ways, including sewage treatment system failures, combined sewer overflows, discharge of sewage by boats, re-suspension of sediments, and rainfall and resulting surface runoff," Nascarella said.

Exposure to high concentrations of fecal bacteria can cause symptoms including gastrointestinal sickness, cold symptoms and skin rashes.

Berman said neglecting infrastructure decades ago caused water quality problems, and investing in repairs is a main part of the solution.

"Thirty years ago, Boston Harbor was a national disgrace," he said. "Our waste washed up on shore from Cape Cod to Cape Ann. Today, we're talking about elevated bacteria on a handful of beaches that we need to address. We have a lot of progress to be proud of. We just have to finish the job."


Quit tossing fish guts into shark-filled waters, Long Beach lifeguards say – Los Angeles Times

With a sizzling heat wave expected to drive uncounted Southern California residents to area beaches this weekend, lifeguards in Long Beach are urging anglers and film crews to stop attracting sharks with bloody chum.

The warning comes at a time of increased shark activity along the California coast. A seasonal glut of juvenile great white sharks has prompted the temporary closure of several beaches and resulted in the injury of at least two surfers.

Long Beach safety officials said they were alarmed recently to hear reports of film crews tossing fish guts near shore in order to draw the predators closer to their cameras.

"There is no shot worth somebody's life; we are all responsible for maintaining public safety," said firefighter-paramedic Jake Heflin, spokesman for the Long Beach Fire Department.

Lifeguards have recently spotted film crews as well as commercial and recreational fishing vessels throwing chum into the water to lure sharks near the Long Beach harbor, according to Heflin.

When lifeguards approached the groups, the activity often stopped, he said.

On one occasion recently, lifeguards intervened before a film crew with National Geographic could begin chumming, Heflin said. He said the crew was cooperative.

The recent chumming activity has been a significant public safety issue for the fire department, he said. As the department gears up for the busy summer beach season and prepares to launch its junior lifeguard program, Heflin said, officials are worried that swimmers could have an unpleasant encounter with a shark lured to the shore with bait.

"I know everybody wants the picture, but you put people's lives at risk," Heflin said. "You really have to question why you are doing that."

Lifeguards have been looking for chumming activity along the shore, but enforcement has been difficult, he said. They must observe individuals tossing fish into the ocean to be able to issue a citation for polluting the water, Heflin said.

"When we do see it, we are citing," Heflin said.

Along with the citations, lifeguards have been conducting additional patrols of the water, he said. Signs posted along the water's edge warn beachgoers of shark sightings.

Over the spring, the fire department received numerous reports of sharks in the waters off Peninsula Beach. Juvenile great white sharks, typically 5 to 6 feet long, have been regularly spotted near the shore.

Experts said the California coast is swarming with young sharks attracted to its safe ecosystem. Young sharks feed off a rich supply of sting rays, and pregnant female sharks prefer Southern Californias warmer water for gestation.

"It's a nursery for young sharks," said Chris Lowe, head of the Shark Lab at Cal State Long Beach.

But there is no way to tell how many sharks inhabit the waters off Southern California, he said.

More state and federal protections have allowed the predator population to thrive over the past 20 years.

But feeding the sharks could change their behavior, Lowe said.

Sharks could become complacent and rely on chum to survive. They could become aggressive and also linger in the waters for longer periods.

"In general, it's a bad idea," he said.

veronica.rocha@latimes.com

Twitter: @VeronicaRochaLA


One beach closed in Quincy, all others open – Wicked Local Hingham

All 65 South Shore beaches that are tested by state and local officials for water quality are open. That includes Wollaston, Nantasket and Plymouth beaches.

See water quality test results for each community and for Cape Cod, the South Coast and North Shore.

For more on Quincy beaches, call 617-376-1288, or visit ledger-quincy-beaches. For more on Wollaston Beach, call 617-626-4972.

HOW BEACHES ARE TESTED

Eighty-five beaches on the South Shore are tested for intestinal bacteria found in humans and animals.

High levels indicate the possible presence of disease-causing microbes that are present in sewage but are more difficult to detect. Bacterial colonies are filtered from three ounces of water and placed on a gel infused with nutrients and chemicals designed to promote growth.

Left in an incubator, the single cells isolated on the filter grow explosively, forming colonies visible to the naked eye. After one day, the colonies are counted, and if they exceed 104 colonies, the beach is closed to swimming. If the past five samples have a geometric mean exceeding 35 colonies, the beach must also be closed to swimming.
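Expressed as a quick calculation, the two closure criteria described above (a single sample over 104 colonies, or a geometric mean above 35 across the five most recent samples) might look like the following sketch; the function name and the sample counts are made up for illustration, not real monitoring data.

    from math import prod

    SINGLE_SAMPLE_LIMIT = 104   # colony count that closes a beach on its own
    GEOMETRIC_MEAN_LIMIT = 35   # limit on the geometric mean of the last five samples

    def beach_closed(last_five_counts):
        """Return True if either closure criterion described above is met.

        last_five_counts: colony counts from the five most recent samples,
        oldest first (illustrative values only).
        """
        latest = last_five_counts[-1]
        geo_mean = prod(last_five_counts) ** (1 / len(last_five_counts))
        return latest > SINGLE_SAMPLE_LIMIT or geo_mean > GEOMETRIC_MEAN_LIMIT

    print(beach_closed([20, 30, 25, 40, 110]))  # True: the latest sample exceeds 104
    print(beach_closed([30, 40, 35, 50, 60]))   # True: geometric mean is about 42
    print(beach_closed([10, 15, 12, 20, 18]))   # False: neither limit is exceeded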


If you like to swim at Lake Erie beaches, our wet spring is bad news – Buffalo News

Last summer's drought kept the sand and surf at Lake Erie's beaches the cleanest in five years, so the beaches usually remained open.

But the region's second-wettest spring on record has changed the outlook for this year's swimming season. Beachgoers in 2017 can expect beaches to be closed to swimming more often.

The increased rainfall, nearly 17.5 inches since March 1, brings more runoff into creeks and streams, sewer overflows, erosion and turbidity, and trickier beach forecasting.

Woodlawn State Park beach has already been closed to swimming more than half the time since opening just before Memorial Day. Swimming at nearby Hamburg Town Park has been closed about a third of the time.

Infrastructure upgrades offer some hope that swimming will be allowed more often. Erie County recently spent $16 million to eliminate wastewater overflows from three pump stations into Rush and Blasdell creeks near Woodlawn.

Other places, like Evans and Dunkirk, are embarking on sustainable green infrastructure projects to capture stormwater before it reaches Lake Erie.

The state Department of Environmental Conservation funded a pair of studies at Lake Erie Beach in Evans and Point Gratiot in Dunkirk to start the process.

"Part of the reason these beaches have been prioritized is because of the number of beach closures over the past few years," said Shannon Dougherty, the DEC's western Great Lakes watershed coordinator.

Closing time

The beach at Wright Park in Dunkirk, where the number of closures dropped significantly in 2016. (Robert Kirkham/News file photo)

Summertime closings at Lake Erie's beaches have been a sign of the times, especially in recent years, after high bacterial levels became commonplace.

Beach closings at seven Lake Erie beaches (Woodlawn State Park, Hamburg, Bennett, Evans, Wright Park and Point Gratiot) dropped by nearly 65 percent from 2015 to 2016, according to the U.S. Environmental Protection Agency's Beach Advisory and Closing Online Notification (BEACON) system.

The unusually dry spring and summer drove most of that reduction. Buffalo recorded only 6.57 inches of rain between March 1 and June 15, 2016.

Rainfall is one of about three dozen factors the beach forecasting model takes into account when projecting whether a beach is safe for swimming. Local officials depend on the model when making a final decision.

Site-specific measurements of the wind, its direction, wave heights, shoreline currents, water levels on tributaries to the beach, water turbidity and numerous other factors are also taken.

"Each beach has its own data set," saidBrett Hayhurst, a water quality specialist at the U.S. Geological Survey's New York Water Science Center in Ithaca.

In Erie County, officials from the county health department also collect daily water samples from the beaches to test bacteria levels and compare them against the model's forecast.

"We're finding we're closing less of the time when the water quality is actually good," saidDolores Funke, the county's environmental health director, "and keeping the beach open on the bad water quality days less."

A plan for Evans Town Park

Over the last decade, Evans Town Beach has been closed to swimmers for 176 summer days because of poor water quality.

The town now has a plan and the $172,125 needed to fix it.

Designs call for bioswales drifting into rain gardens in Evans Town Park near the beach, according to Roberta L. Rappoccio, the town's director of planning and community development.

"Why not beautify, be green and serve a purpose?" Rappoccio said.

If successful, the green infrastructure project should alleviate stormwater flows from nearby Fernbrook Creek that often flood the area, including a pedestrian tunnel under Lake Shore Road that joins the park and beach, with up to several feet of water during heavy rain storms.

"We're getting 25-year storms every year," Rappoccio said. "That's something we're not equipped for."

Rappoccio said those rain gardens, which will include native plantings, will be designed to capture storm water and slowly filter it before it eventually finds its way a few hundred yards away onto the town beach.

The work is expected to be completed later this year. It is being funded through a federal grant from the EPA.

What's more, after nearby Mickey Rat's expected closing at the end of the summer, Rappoccio said the town's new waterfront design standards will require the use of green infrastructure on any future redevelopment of the site.

"There's no way to do it otherwise," she said. "There's no question about it."

Projects like this to capture and filter stormwater aren't new to the folks at Buffalo Niagara Riverkeeper, but they're heartened that the message is starting to spread.

"Nature wants the eastern basin of Lake Erie to be dominated bycoastal wetland, though human settlement over the last 200 yearshas altered these systems almost beyond recognition," said Jill Jedlicka, Riverkeeper's executive director. "Protecting and restoring Great Lakes living infrastructure is a cost-effective approach to flood protection, filteringdrinking water, andecosystem improvements."

Lake Erie Beach

Views from a closed Lake Erie Beach on Aug. 14, 2009. (Harry Scull Jr./News file photo)

Just west of Evans Town Park, Lake Erie Beach at Evans' Point Breeze community has experienced similar troubles.

Since 2008, it has been closed 160 times.

One culprit: aged septic tanks that leak into Muddy Creek from cottages a half-century or more old.

"We know it's an issue," Rappoccio said.

The town spent about $75,000 from a state water quality improvement grant to design a plan to help clean up Lake Erie Beach.

It involves a two-phase approach with native plantings and a wetland area that can capture, slow and filter stormwater before it reaches Lake Erie.

There's no timetable yet for construction, but the town is actively pursuing funding to complete that work.

"Who knows?" Rappoccio said. "Maybe we'll be one of the communities where we'll be known as one of these green tourism areas."

Dunkirk's Point Gratiot Park

Waves crash into the cliffs below the lighthouse in Dunkirk. (Mark Mulville/News file photo)

Dunkirk's Rebecca Yanus is directing the city's efforts to help filter and clean the water at Point Gratiot Park beach.

Like Evans, Dunkirk hopes a series of five strategically placed rain gardens with several native planting areas will help alleviate flooding in the area and run-off onto the beach and into the lake.

"Run-off would come through the rain garden," Yanus said. "It's very important we do that at different spots on the property."

A DEC-funded study shows the rain gardens will be filled with a native bedding stone and surrounded by prairie grass and dozens of spots for native plants like the cardinal flower, black-eyed Susans and purple coneflowers, among others.

Two of the rain gardens at West Oak Street and Park Drive will come with accompanying bioswales.

"The bedding will be mostly rocks and shale," Yanus said. "A lot of the materials that will be used will be materials that are found right along the beach."

That will help defray additional construction costs for the project, which is estimated to be about $140,000.

Point Gratiot has fewer closures than most spots closer to Buffalo, but it remains in the double digits for each year dating back to 2010, according to EPA data. The highest, 28 closures, came in 2010 and 2013.

Dunkirk officials hope the DEC's involvement in the early part of the process bodes well when the city applies to the agency this summer for a grant to fund the project. If funded, work would be done in 2018, Yanus said.

"We're hoping it will bring our water quality up to par so many more people can enjoy our beach," she said.

A big project at Woodlawn

Beachgoers enjoying Woodlawn Beach State Park. (John Hickey/Buffalo News)

At Woodlawn State Park, where there's more pavement and more runoff than at rural parks, cleaning the water will take a lot more than a rain garden or two.

A 2010 state parks report, updated in 2015, cited five main culprits for impaired water quality at Woodlawn: stormwater outfalls, urban run-off, contaminated stream discharge, algal and leaf debris, and sewage overflows.

The county's expensive Rush Creek Interceptor project addressed the overflows.

"It made it possible for us to eliminate the Blasdell Treatment Plant there, pump stations and those overflows," said Joseph Fiegl, Erie County's deputy commissioner for the Division of Sewerage Management.

Since Nov. 19, there have been no overflows from the traditionally problematic pump stations at Electric Avenue, Blasdell and Labelle, even with some of the extreme weather this spring.

Even so, it's been of small consolation for Woodlawn's beachgoers this spring.

The beach was closed for six straight days over the Memorial Day holiday from May 28 to June 2 and has been closed five more times since June 5, according to Nowcast beach data.

"It's not simply a beach problem," Fiegl said. Woodlawn Beach's water quality problem requires a wholesale solution.

"This is a watershed issue, he said.


Indiana Dunes beaches are ready for summer | Lake County News … – nwitimes.com

Beaches at Indiana Dunes National Lakeshore and Indiana Dunes State Park have been readied for summer, and continued testing after an April chemical spill in a Lake Michigan tributary indicates the water is safe, a National Park Service spokesman said.

The park service began working with EPA to establish a long-term monitoring program after U.S. Steel's Midwest Plant spilled 298 pounds of hexavalent chromium into the Burns Waterway near Lake Michigan in April.

The spill was about 584 times the daily maximum limit allowed under state permitting laws, Indiana Department of Environmental Management documents show.

EPA said in a June news release that water testing found "no hexavalent chromium impacts to the Burns Waterway or Lake Michigan." The park service will continue to test waters once a week through Aug. 30, National Lakeshore Supervisory Park Ranger Bruce Rowe said.

Preparations for summer included cleaning up about 17 miles of beaches, placing buoys in the water to mark swim beaches, erecting lifeguard stands, and moving tons of sand that covered parking lots and walkways during the off-season, Rowe said. The West Beach bathhouse and restroom facilities at Kemil, Porter, Lakeview and Dunbar beaches also were opened.

The park service plans to reopen access to the beach at Mount Baldy later this summer. The Mount Baldy area has been closed since a then-6-year-old boy was swallowed by sand in a freak accident in 2013. The boy survived.

Extensive testing found the beach to be safe, and the park service plans to open it after work on an access trail is completed, according to a news release. The Mount Baldy dune will remain closed, with the exception of ranger-led tours.

For the first time in decades, the number of lifeguards at Indiana Dunes State Park has been increased. Swimming is allowed only when lifeguards are on duty, the release said.

Swimming will be permitted seven days a week at the eastern and western portions of the beach. In past years, swimming was allowed only on the western section during weekdays.

There will be a family-friendly fireworks program June 29 at Indiana Dunes State Park and a sand sculpture contest July 8.

West Beach visitors can try out kayaks and paddleboards in Lake Michigan at 1 p.m. Fridays during the free Beach Fun Friday program, which will be led by park staff. After sunset, visitors will be invited to gather around a beach campfire to enjoy stories and roast marshmallows. The program runs through Sept. 1.


Red flags up as high rip current risk for all beaches today – WITN

EMERALD ISLE, NC (WITN) The high risk for dangerous rip currents will continue into Saturday for all of our beaches.

The National Weather Service warns that the surf remains dangerous for all swimmers, no matter how experienced they are.

On Friday, red flags were flying at Emerald Isle and Pine Knoll Shores, while Atlantic Beach was flying yellow warning flags.

This past weekend, two Wayne County teens got caught in dangerous rips at Emerald Isle. One boy drowned, while the other was critically injured.

The high risk area is from North Topsail Beach in Onslow County to Duck in Dare County.

Meteorologists say the worst time for rip currents on Friday will be a couple hours before or after high tide, which is around 7:00 p.m.

On Thursday, life guards at Emerald Isle rescued 13 people, bringing the number to 25 for the week just in that beach town.

Previous Story

Emerald Isle lifeguards say this summer is shaping up to be one of their busiest in recent years with 13 more rescues of swimmers from rip currents Thursday.

That brings the total number of rescues this week at Emerald Isle to 25.

One person drowned while another was taken to the hospital in critical condition. Those rescued Thursday are all said to be fine.

Jordan East says this is her 6th year as a lifeguard at Emerald Isle, and it is starting out as the busiest.

Jordan says, "I think most of the time people come down here they see our flags they see the water, they don't really know how to recognize it and we try our best to educate the public."

Other Crystal Coast beaches have had a number of rescues as well. Atlantic Beach has had 14 rescues in the past three weeks. Salter Path has had 5 rescues and 10 distress calls, and North Topsail Beach has had 4 distress calls. Distress calls mean emergency crews were called to the beach for someone who may have been panicking out in the ocean, but no one had to be pulled from the water.

Lifeguards advise you swim at beaches where there are lifeguards, and say if you get caught in a rip, swim parallel to shore until you are out of it.

The rip current risk along the coast remains in the moderate range.


Chinese astronomy satellite placed into orbit by Long March rocket – Spaceflight Now

China's Long March 4B rocket lifts off Thursday with the Hard X-ray Modulation Telescope. Credit: Xinhua

China's first X-ray astronomy satellite launched Thursday on a mission to survey the Milky Way galaxy for black holes and pulsars, the remnants left behind after a star burns up its nuclear fuel.

The Hard X-ray Modulation Telescope will also detect gamma-ray bursts, the most violent explosions in the universe, and try to help astronomers link the outbursts with gravitational waves, unseen ripples through the cosmos generated by cataclysmic events like supernova explosions and mergers of black holes.

The orbiting X-ray observatory, renamed Huiyan, or Insight, following Thursday's launch, is China's first space telescope and second space mission dedicated to astronomy, after a Chinese particle physics probe was sent into orbit in 2015 to search for evidence of dark matter.

"Before its launch, we could only use second-hand observation data from foreign satellites," said Xiong Shaolin, a scientist at the Institute of High Energy Physics of the Chinese Academy of Sciences. "It was very hard for Chinese astronomers to make important findings without our own instruments."

"The only way to make original achievements is to construct our own observation instruments," Xiong said in a report by China's state-run Xinhua news agency. "Now Chinese scientists have created this space telescope with its many unique advantages, and it's quite possible we will discover new, strange and unexpected phenomena in the universe."

The X-ray telescope launched at 0300 GMT Thursday (11 p.m. EDT Wednesday) aboard a Long March 4B rocket from the Jiuquan space center in northwestern China's Gobi Desert. Liftoff occurred at 11 a.m. Thursday Beijing time.

The Long March 4B booster, powered by three hydrazine-fueled stages, delivered the Huiyan telescope into a 335-mile-high (540-kilometer) orbit tilted 43 degrees to the equator, according to tracking data released by the U.S. military. That is very close to the X-ray telescope's intended operating orbit.

Ground controllers plan to activate and test the observatory over the next five months before it enters service late this year, fulfilling a mission first proposed by Chinese scientists in 1994 and formally approved by the Chinese government in 2011, according to the Chinese Academy of Sciences.

The 5,500-pound (2,500-kilogram) Huiyan spacecraft is designed for a four-year mission. Its three X-ray instruments, sorted to observe low, medium and high-energy X-rays, are sensitive to 1,000 to 250,000 electron volts, an energy range that encompasses the energy of a medical X-ray.

Earth's atmosphere absorbs X-ray light signals, so astronomers must build and launch satellites for the job. X-ray observatories are uniquely suited for studies of black holes and neutron stars, two of the densest types of objects in the universe, created in the aftermath of supernovas, the explosions at the end of a star's life.

Unlike X-ray telescopes launched by NASA and the European Space Agency, China's Huiyan mission does not use grazing mirrors, which must be extremely flat to reflect high-frequency X-ray waves. Chinese officials said they do not have the expertise to build such flat mirrors, so scientists came up with a backup plan that does not rely on traditional imaging.

The observing method, called demodulation, can help reconstruct the image of X-ray sources by using data from relatively simple non-imaging detectors, such as a telescope with collimators that collects and records X-ray photons parallel to a specified direction, Xinhua reported.

Scientists said the Chinese X-ray telescope will be able to observe brighter targets than other X-ray missions because the demodulation method diffuses X-ray light. Other telescopes reflect and focus X-ray photons onto detectors.
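The general idea behind recovering an image from non-imaging, collimated detectors can be sketched with a toy example: if the instrument's response matrix describes how much each patch of sky contributes to each pointing, the source distribution can be reconstructed by iteratively solving the resulting system of counts. The code below uses a generic Richardson-Lucy-style multiplicative update on simulated data; it is only an illustration of the concept, not the mission's actual direct demodulation algorithm, and all numbers are made up.

    import numpy as np

    # Toy reconstruction of a 1-D "sky" from non-imaging, collimated counts.
    # Observed counts d satisfy d = R @ f, where R encodes how much each sky
    # bin contributes to each pointing. The multiplicative update below is a
    # generic Richardson-Lucy-style iteration, used here only to illustrate
    # demodulation-type reconstruction; all values are simulated.
    rng = np.random.default_rng(0)

    n_sky, n_obs = 40, 60
    sky = np.zeros(n_sky)
    sky[[10, 28]] = [100.0, 60.0]               # two made-up point sources

    centers = np.linspace(0, n_sky - 1, n_obs)  # collimator pointing centers
    R = np.exp(-0.5 * ((np.arange(n_sky) - centers[:, None]) / 3.0) ** 2)

    counts = rng.poisson(R @ sky)               # noisy detector counts

    f = np.ones(n_sky)                          # flat, non-negative first guess
    for _ in range(200):
        model = R @ f + 1e-12
        f *= (R.T @ (counts / model)) / R.sum(axis=0)

    print("Brightest reconstructed bins:", sorted(np.argsort(f)[-2:]))  # expect bins near 10 and 28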

"No matter how bright the sources are, our telescope won't be blinded," said Chen Yong, chief designer of Huiyan's low-energy X-ray instrument, in an interview with Xinhua.

"We are looking forward to discovering new activities of black holes and studying the state of neutron stars under extreme gravity and density conditions, and physical laws under extreme magnetic fields," said Zhang Shuangnan, the X-ray mission's lead scientist. "These studies are expected to bring new breakthroughs in physics."

Another set of detectors on the Huiyan telescope, originally added to shield against background noise, can be adjusted to make the observatory sensitive to even higher-energy gamma rays, according to the Xinhua news agency.

The detection of gravitational waves by ground-based sensors in Washington and Louisiana opened a new door in astronomy. Created by distant collisions and explosions, gravitational waves are ripples through the fabric of spacetime, and astronomers now seek to connect the phenomena with events seen by conventional telescopes.

"Since gravitational waves were detected, the study of gamma-ray bursts has become more important," Zhang said in Xinhua's report on the mission. "In astrophysics research, it's insufficient to study just the gravitational wave signals. We need to use the corresponding electromagnetic signals, which are more familiar to astronomers, to facilitate the research on gravitational waves."

The launch of the Huiyan space telescope comes as NASA scientists turn on and calibrate another X-ray instrument recently delivered to the International Space Station.

After its launch June 3 on a SpaceX supply ship heading to the space station, NASA's Neutron Star Interior Composition Explorer will spend the next 18 months studying the structure and behavior of neutron stars.

Three other satellites joined China's Huiyan spacecraft on Thursday's launch.

The OVS 1A and 1B satellites are the first two members of a commercial constellation of Earth-imaging craft for Zhuhai Orbita Control Engineering Co. Ltd., based in southern China's Guangdong province. The two 121-pound (55-kilogram) satellites will record high-resolution video from orbit, and future spacecraft in the Zhuhai 1 fleet will collect hyperspectral and radar imagery.

The ÑuSat 3 microsatellite owned by Satellogic, an Argentine company, was also aboard the Long March 4B rocket Thursday.

Built in Montevideo, Uruguay, by a Satellogic subsidiary company, ÑuSat 3 weighs around 80 pounds (37 kilograms) and is identical to two ÑuSat satellites launched on a Chinese rocket in May 2016.

Each ÑuSat craft hosts cameras to capture imagery in color, infrared and in the hyperspectral regime, which gives analysts additional information about the makeup of objects, plants and terrain in Earth observation products. The satellites can resolve features on Earth as small as 3.3 feet (1 meter) across.

ÑuSat 3 is nicknamed Milanesat, after the traditional Argentine steak dish Milanesa. The first two ÑuSat satellites launched last year were named after Argentine desserts.

Satellogic is one of several privately-funded companies launching sharp-eyed commercial Earth-viewing satellites to collect daily images of the entire planet. The company says its satellite constellation, which could eventually number from 25 to several hundred spacecraft, will help urban planners, emergency responders, crop managers, and scientists tracking the effects of climate change.


Follow Stephen Clark on Twitter: @StephenClark1.


Prairie Astronomy Club will have solar telescopes set up – Lincoln Journal Star

The monthly meeting of the Prairie Astronomy Club is 7:30 p.m. Tuesday (June 27) at Hyde Memorial Observatory on the south side of Holmes Park. The club will have a special solar observing event that same evening starting at 6 p.m.

Club members will have special solar telescopes set up to safely look at the sun. The public is encouraged to come and observe the sun through these telescopes. The Total Solar Eclipse on Aug. 21 is considered a once-in-a-lifetime event. You'll want to be as prepared as possible to enjoy this occurrence.

"Come to the Prairie Astronomy Club's monthly meeting June 27 and let us assist you in being as prepared as possible," says club spokesman Jim Kvasnicka.

The Prairie Astronomy Club will answer any questions you have regarding the upcoming Total Solar Eclipse, which will occur Aug. 21, at the club's monthly meeting in June.

Topics to be discussed are likely to include:

- When will the eclipse begin?

- How long will it last?

- What will I see?

- What do I need to safely look at the sun?


Timeline of artificial intelligence – Wikipedia

- Antiquity: Greek myths of Hephaestus and Pygmalion incorporated the idea of intelligent robots (such as Talos) and artificial beings (such as Galatea and Pandora).[1]
- Antiquity: Yan Shi presented King Mu of Zhou with mechanical men.[2]
- Antiquity: Sacred mechanical statues built in Egypt and Greece were believed to be capable of wisdom and emotion. Hermes Trismegistus would write "they have sensus and spiritus ... by discovering the true nature of the gods, man has been able to reproduce it." Mosaic law prohibits the use of automatons in religion.[3]
- 384 BC-322 BC: Aristotle described the syllogism, a method of formal, mechanical thought.
- 1st century: Heron of Alexandria created mechanical men and other automatons.[4]
- 260: Porphyry of Tyros wrote Isagoge, which categorized knowledge and logic.[5]
- ~800: Geber develops the Arabic alchemical theory of Takwin, the artificial creation of life in the laboratory, up to and including human life.[6]
- 1206: Al-Jazari created a programmable orchestra of mechanical human beings.[7]
- 1275: Ramon Llull, Spanish theologian, invents the Ars Magna, a tool for combining concepts mechanically, based on an Arabic astrological tool, the Zairja. The method would be developed further by Gottfried Leibniz in the 17th century.[8]
- ~1500: Paracelsus claimed to have created an artificial man out of magnetism, sperm and alchemy.[9]
- ~1580: Rabbi Judah Loew ben Bezalel of Prague is said to have invented the Golem, a clay man brought to life.[10]
- Early 17th century: René Descartes proposed that bodies of animals are nothing more than complex machines (but that mental phenomena are of a different "substance").[11]
- 1623: Wilhelm Schickard drew a calculating clock on a letter to Kepler. This will be the first of five unsuccessful attempts at designing a direct entry calculating clock in the 17th century (including the designs of Tito Burattini, Samuel Morland and René Grillet).[12]
- 1641: Thomas Hobbes published Leviathan and presented a mechanical, combinatorial theory of cognition. He wrote "...for reason is nothing but reckoning".[13][14]
- 1642: Blaise Pascal invented the mechanical calculator,[15] the first digital calculating machine.[16]
- 1672: Gottfried Leibniz improved the earlier machines, making the Stepped Reckoner to do multiplication and division. He also invented the binary numeral system and envisioned a universal calculus of reasoning (alphabet of human thought) by which arguments could be decided mechanically. Leibniz worked on assigning a specific number to each and every object in the world, as a prelude to an algebraic solution to all possible problems.[17]
- 1726: Jonathan Swift published Gulliver's Travels, which includes this description of the Engine, a machine on the island of Laputa: "a Project for improving speculative Knowledge by practical and mechanical Operations" by using this "Contrivance", "the most ignorant Person at a reasonable Charge, and with a little bodily Labour, may write Books in Philosophy, Poetry, Politicks, Law, Mathematicks, and Theology, with the least Assistance from Genius or study."[18] The machine is a parody of Ars Magna, one of the inspirations of Gottfried Leibniz' mechanism.
- 1750: Julien Offray de La Mettrie published L'Homme Machine, which argued that human thought is strictly mechanical.[19]
- 1769: Wolfgang von Kempelen built and toured with his chess-playing automaton, The Turk.[20] The Turk was later shown to be a hoax, involving a human chess player.
- 1818: Mary Shelley published the story of Frankenstein; or the Modern Prometheus, a fictional consideration of the ethics of creating sentient beings.[21]
- 1822-1859: Charles Babbage & Ada Lovelace worked on programmable mechanical calculating machines.[22]
- 1837: The mathematician Bernard Bolzano made the first modern attempt to formalize semantics.
- 1854: George Boole set out to "investigate the fundamental laws of those operations of the mind by which reasoning is performed, to give expression to them in the symbolic language of a calculus", inventing Boolean algebra.[23]
- 1863: Samuel Butler suggested that Darwinian evolution also applies to machines, and speculates that they will one day become conscious and eventually supplant humanity.[24]

- 1913: Bertrand Russell and Alfred North Whitehead published Principia Mathematica, which revolutionized formal logic.
- 1915: Leonardo Torres y Quevedo built a chess automaton, El Ajedrecista, and published speculation about thinking and automata.[25]
- 1923: Karel Čapek's play R.U.R. (Rossum's Universal Robots) opened in London. This is the first use of the word "robot" in English.[26]
- 1920s and 1930s: Ludwig Wittgenstein and Rudolf Carnap lead philosophy into logical analysis of knowledge. Alonzo Church develops Lambda Calculus to investigate computability using recursive functional notation.
- 1931: Kurt Gödel showed that sufficiently powerful formal systems, if consistent, permit the formulation of true theorems that are unprovable by any theorem-proving machine deriving all possible theorems from the axioms. To do this he had to build a universal, integer-based programming language, which is the reason why he is sometimes called the "father of theoretical computer science".
- 1941: Konrad Zuse built the first working program-controlled computers.[27]
- 1943: Warren Sturgis McCulloch and Walter Pitts publish "A Logical Calculus of the Ideas Immanent in Nervous Activity" (1943), laying foundations for artificial neural networks.[28]
- 1943: Arturo Rosenblueth, Norbert Wiener and Julian Bigelow coin the term "cybernetics". Wiener's popular book by that name published in 1948.
- 1945: Game theory, which would prove invaluable in the progress of AI, was introduced with the 1944 paper Theory of Games and Economic Behavior by mathematician John von Neumann and economist Oskar Morgenstern.
- 1945: Vannevar Bush published As We May Think (The Atlantic Monthly, July 1945), a prescient vision of the future in which computers assist humans in many activities.
- 1948: John von Neumann (quoted by E.T. Jaynes), in response to a comment at a lecture that it was impossible for a machine to think: "You insist that there is something a machine cannot do. If you will tell me precisely what it is that a machine cannot do, then I can always make a machine which will do just that!" Von Neumann was presumably alluding to the Church-Turing thesis, which states that any effective procedure can be simulated by a (generalized) computer.

- 1950: Alan Turing proposes the Turing Test as a measure of machine intelligence.[29]
- 1950: Claude Shannon published a detailed analysis of chess playing as search.
- 1950: Isaac Asimov published his Three Laws of Robotics.
- 1951: The first working AI programs were written in 1951 to run on the Ferranti Mark 1 machine of the University of Manchester: a checkers-playing program written by Christopher Strachey and a chess-playing program written by Dietrich Prinz.
- 1952-1962: Arthur Samuel (IBM) wrote the first game-playing program,[30] for checkers (draughts), to achieve sufficient skill to challenge a respectable amateur. His first checkers-playing program was written in 1952, and in 1955 he created a version that learned to play.[31]
- 1956: The first Dartmouth College summer AI conference is organized by John McCarthy, Marvin Minsky, Nathan Rochester of IBM and Claude Shannon.
- 1956: The name artificial intelligence is used for the first time as the topic of the second Dartmouth Conference, organized by John McCarthy.[32]
- 1956: The first demonstration of the Logic Theorist (LT) written by Allen Newell, J.C. Shaw and Herbert A. Simon (Carnegie Institute of Technology, now Carnegie Mellon University, or CMU). This is often called the first AI program, though Samuel's checkers program also has a strong claim.
- 1957: The General Problem Solver (GPS) demonstrated by Newell, Shaw and Simon while at CMU.
- 1958: John McCarthy (Massachusetts Institute of Technology or MIT) invented the Lisp programming language.
- 1958: Herbert Gelernter and Nathan Rochester (IBM) described a theorem prover in geometry that exploits a semantic model of the domain in the form of diagrams of "typical" cases.
- 1958: Teddington Conference on the Mechanization of Thought Processes was held in the UK, and among the papers presented were John McCarthy's Programs with Common Sense, Oliver Selfridge's Pandemonium, and Marvin Minsky's Some Methods of Heuristic Programming and Artificial Intelligence.
- 1959: John McCarthy and Marvin Minsky founded the MIT AI Lab.
- Late 1950s, early 1960s: Margaret Masterman and colleagues at University of Cambridge design semantic nets for machine translation.

- 1960s: Ray Solomonoff lays the foundations of a mathematical theory of AI, introducing universal Bayesian methods for inductive inference and prediction.
- 1960: Man-Computer Symbiosis by J.C.R. Licklider.
- 1961: James Slagle (PhD dissertation, MIT) wrote (in Lisp) the first symbolic integration program, SAINT, which solved calculus problems at the college freshman level.
- 1961: In Minds, Machines and Gödel, John Lucas[33] denied the possibility of machine intelligence on logical or philosophical grounds. He referred to Kurt Gödel's result of 1931: sufficiently powerful formal systems are either inconsistent or allow for formulating true theorems unprovable by any theorem-proving AI deriving all provable theorems from the axioms. Since humans are able to "see" the truth of such theorems, machines were deemed inferior.
- 1961: Unimation's industrial robot Unimate worked on a General Motors automobile assembly line.
- 1963: Thomas Evans' program, ANALOGY, written as part of his PhD work at MIT, demonstrated that computers can solve the same analogy problems as are given on IQ tests.
- 1963: Edward Feigenbaum and Julian Feldman published Computers and Thought, the first collection of articles about artificial intelligence.
- 1963: Leonard Uhr and Charles Vossler published "A Pattern Recognition Program That Generates, Evaluates, and Adjusts Its Own Operators", which described one of the first machine learning programs that could adaptively acquire and modify features and thereby overcome the limitations of simple perceptrons of Rosenblatt.
- 1964: Danny Bobrow's dissertation at MIT (technical report #1 from MIT's AI group, Project MAC) shows that computers can understand natural language well enough to solve algebra word problems correctly.
- 1964: Bertram Raphael's MIT dissertation on the SIR program demonstrates the power of a logical representation of knowledge for question-answering systems.
- 1965: J. Alan Robinson invented a mechanical proof procedure, the Resolution Method, which allowed programs to work efficiently with formal logic as a representation language.
- 1965: Joseph Weizenbaum (MIT) built ELIZA, an interactive program that carries on a dialogue in English language on any topic. It was a popular toy at AI centers on the ARPANET when a version that "simulated" the dialogue of a psychotherapist was programmed.
- 1965: Edward Feigenbaum initiated Dendral, a ten-year effort to develop software to deduce the molecular structure of organic compounds using scientific instrument data. It was the first expert system.
- 1966: Ross Quillian (PhD dissertation, Carnegie Inst. of Technology, now CMU) demonstrated semantic nets.
- 1966: Machine Intelligence[34] workshop at Edinburgh, the first of an influential annual series organized by Donald Michie and others.
- 1966: Negative report on machine translation kills much work in Natural language processing (NLP) for many years.
- 1967: Dendral program (Edward Feigenbaum, Joshua Lederberg, Bruce Buchanan, Georgia Sutherland at Stanford University) demonstrated to interpret mass spectra on organic chemical compounds. First successful knowledge-based program for scientific reasoning.
- 1968: Joel Moses (PhD work at MIT) demonstrated the power of symbolic reasoning for integration problems in the Macsyma program. First successful knowledge-based program in mathematics.
- 1968: Richard Greenblatt (programmer) at MIT built a knowledge-based chess-playing program, MacHack, that was good enough to achieve a class-C rating in tournament play.
- 1968: Wallace and Boulton's program, Snob (Comp.J. 11(2) 1968), for unsupervised classification (clustering) uses the Bayesian Minimum Message Length criterion, a mathematical realisation of Occam's razor.
- 1969: Stanford Research Institute (SRI): Shakey the Robot demonstrated combining animal locomotion, perception and problem solving.
- 1969: Roger Schank (Stanford) defined conceptual dependency model for natural language understanding. Later developed (in PhD dissertations at Yale University) for use in story understanding by Robert Wilensky and Wendy Lehnert, and for use in understanding memory by Janet Kolodner.
- 1969: Yorick Wilks (Stanford) developed the semantic coherence view of language called Preference Semantics, embodied in the first semantics-driven machine translation program, and the basis of many PhD dissertations since, such as Bran Boguraev and David Carter at Cambridge.
- 1969: First International Joint Conference on Artificial Intelligence (IJCAI) held at Stanford.
- 1969: Marvin Minsky and Seymour Papert publish Perceptrons, demonstrating previously unrecognized limits of this feed-forward two-layered structure. This book is considered by some to mark the beginning of the AI winter of the 1970s, a failure of confidence and funding for AI. Nevertheless, significant progress in the field continued (see below).
- 1969: McCarthy and Hayes started the discussion about the frame problem with their essay, "Some Philosophical Problems from the Standpoint of Artificial Intelligence".

- Early 1970s: Jane Robinson and Don Walker established an influential Natural Language Processing group at SRI.
- 1970: Jaime Carbonell (Sr.) developed SCHOLAR, an interactive program for computer assisted instruction based on semantic nets as the representation of knowledge.
- 1970: Bill Woods described Augmented Transition Networks (ATN's) as a representation for natural language understanding.
- 1970: Patrick Winston's PhD program, ARCH, at MIT learned concepts from examples in the world of children's blocks.
- 1971: Terry Winograd's PhD thesis (MIT) demonstrated the ability of computers to understand English sentences in a restricted world of children's blocks, in a coupling of his language understanding program, SHRDLU, with a robot arm that carried out instructions typed in English.
- 1971: Work on the Boyer-Moore theorem prover started in Edinburgh.[35]
- 1972: Prolog programming language developed by Alain Colmerauer.
- 1972: Earl Sacerdoti developed one of the first hierarchical planning programs, ABSTRIPS.
- 1973: The Assembly Robotics Group at University of Edinburgh builds Freddy Robot, capable of using visual perception to locate and assemble models. (See Edinburgh Freddy Assembly Robot: a versatile computer-controlled assembly system.)
- 1973: The Lighthill report gives a largely negative verdict on AI research in Great Britain and forms the basis for the decision by the British government to discontinue support for AI research in all but two universities.
- 1974: Ted Shortliffe's PhD dissertation on the MYCIN program (Stanford) demonstrated a very practical rule-based approach to medical diagnoses, even in the presence of uncertainty. While it borrowed from DENDRAL, its own contributions strongly influenced the future of expert system development, especially commercial systems.
- 1975: Earl Sacerdoti developed techniques of partial-order planning in his NOAH system, replacing the previous paradigm of search among state space descriptions. NOAH was applied at SRI International to interactively diagnose and repair electromechanical systems.
- 1975: Austin Tate developed the Nonlin hierarchical planning system, able to search a space of partial plans characterised as alternative approaches to the underlying goal structure of the plan.
- 1975: Marvin Minsky published his widely read and influential article on Frames as a representation of knowledge, in which many ideas about schemas and semantic links are brought together.
- 1975: The Meta-Dendral learning program produced new results in chemistry (some rules of mass spectrometry), the first scientific discoveries by a computer to be published in a refereed journal.
- Mid-1970s: Barbara Grosz (SRI) established limits to traditional AI approaches to discourse modeling. Subsequent work by Grosz, Bonnie Webber and Candace Sidner developed the notion of "centering", used in establishing focus of discourse and anaphoric references in Natural language processing.
- Mid-1970s: David Marr and MIT colleagues describe the "primal sketch" and its role in visual perception.
- 1976: Douglas Lenat's AM program (Stanford PhD dissertation) demonstrated the discovery model (loosely guided search for interesting conjectures).
- 1976: Randall Davis demonstrated the power of meta-level reasoning in his PhD dissertation at Stanford.
- 1978: Tom Mitchell, at Stanford, invented the concept of Version spaces for describing the search space of a concept formation program.
- 1978: Herbert A. Simon wins the Nobel Prize in Economics for his theory of bounded rationality, one of the cornerstones of AI known as "satisficing".
- 1978: The MOLGEN program, written at Stanford by Mark Stefik and Peter Friedland, demonstrated that an object-oriented programming representation of knowledge can be used to plan gene-cloning experiments.
1979 Bill VanMelle's PhD dissertation at Stanford demonstrated the generality of MYCIN's representation of knowledge and style of reasoning in his EMYCIN program, the model for many commercial expert system "shells". 1979 Jack Myers and Harry Pople at University of Pittsburgh developed INTERNIST, a knowledge-based medical diagnosis program based on Dr. Myers' clinical knowledge. 1979 Cordell Green, David Barstow, Elaine Kant and others at Stanford demonstrated the CHI system for automatic programming. 1979 The Stanford Cart, built by Hans Moravec, becomes the first computer-controlled, autonomous vehicle when it successfully traverses a chair-filled room and circumnavigates the Stanford AI Lab. 1979 BKG, a backgammon program written by Hans Berliner at CMU, defeats the reigning world champion. 1979 Drew McDermott and Jon Doyle at MIT, and John McCarthy at Stanford begin publishing work on non-monotonic logics and formal aspects of truth maintenance. Late 1970s Stanford's SUMEX-AIM resource, headed by Ed Feigenbaum and Joshua Lederberg, demonstrates the power of the ARPAnet for scientific collaboration. Date Development 1980s Lisp machines developed and marketed. First expert system shells and commercial applications. 1980 First National Conference of the American Association for Artificial Intelligence (AAAI) held at Stanford. 1981 Danny Hillis designs the connection machine, which utilizes Parallel computing to bring new power to AI, and to computation in general. (Later founds Thinking Machines Corporation) 1982 The Fifth Generation Computer Systems project (FGCS), an initiative by Japan's Ministry of International Trade and Industry, begun in 1982, to create a "fifth generation computer" (see history of computing hardware) which was supposed to perform much calculation utilizing massive parallelism. 1983 John Laird and Paul Rosenbloom, working with Allen Newell, complete CMU dissertations on Soar (program). 1983 James F. Allen invents the Interval Calculus, the first widely used formalization of temporal events. Mid-1980s Neural Networks become widely used with the Backpropagation algorithm (first described by Paul Werbos in 1974). 1985 The autonomous drawing program, AARON, created by Harold Cohen, is demonstrated at the AAAI National Conference (based on more than a decade of work, and with subsequent work showing major developments). 1986 The team of Ernst Dickmanns at Bundeswehr University of Munich builds the first robot cars, driving up to 55mph on empty streets. 1986 Barbara Grosz and Candace Sidner create the first computation model of discourse, establishing the field of research.[36] 1987 Marvin Minsky published The Society of Mind, a theoretical description of the mind as a collection of cooperating agents. He had been lecturing on the idea for years before the book came out (c.f. Doyle 1983).[37] 1987 Around the same time, Rodney Brooks introduced the subsumption architecture and behavior-based robotics as a more minimalist modular model of natural intelligence; Nouvelle AI. 1987 Commercial launch of generation 2.0 of Alacrity by Alacritous Inc./Allstar Advice Inc. Toronto, the first commercial strategic and managerial advisory system. The system was based upon a forward-chaining, self-developed expert system with 3,000 rules about the evolution of markets and competitive strategies and co-authored by Alistair Davidson and Mary Chung, founders of the firm with the underlying engine developed by Paul Tarvydas. 
The Alacrity system also included a small financial expert system that interpreted financial statements and models.[38] 1989 Dean Pomerleau at CMU creates ALVINN (An Autonomous Land Vehicle in a Neural Network). Date Development Early 1990s TD-Gammon, a backgammon program written by Gerry Tesauro, demonstrates that reinforcement (learning) is powerful enough to create a championship-level game-playing program by competing favorably with world-class players. 1990s Major advances in all areas of AI, with significant demonstrations in machine learning, intelligent tutoring, case-based reasoning, multi-agent planning, scheduling, uncertain reasoning, data mining, natural language understanding and translation, vision, virtual reality, games, and other topics. 1991 DART scheduling application deployed in the first Gulf War paid back DARPA's investment of 30 years in AI research.[39] 1993 Ian Horswill extended behavior-based robotics by creating Polly, the first robot to navigate using vision and operate at animal-like speeds (1 meter/second). 1993 Rodney Brooks, Lynn Andrea Stein and Cynthia Breazeal started the widely publicized MIT Cog project with numerous collaborators, in an attempt to build a humanoid robot child in just five years. 1993 ISX corporation wins "DARPA contractor of the year"[40] for the Dynamic Analysis and Replanning Tool (DART) which reportedly repaid the US government's entire investment in AI research since the 1950s.[41] 1994 With passengers on board, the twin robot cars VaMP and VITA-2 of Ernst Dickmanns and Daimler-Benz drive more than one thousand kilometers on a Paris three-lane highway in standard heavy traffic at speeds up to 130km/h. They demonstrate autonomous driving in free lanes, convoy driving, and lane changes left and right with autonomous passing of other cars. 1994 English draughts (checkers) world champion Tinsley resigned a match against computer program Chinook. Chinook defeated 2nd highest rated player, Lafferty. Chinook won the USA National Tournament by the widest margin ever. 1995 "No Hands Across America": A semi-autonomous car drove coast-to-coast across the United States with computer-controlled steering for 2,797 miles (4,501km) of the 2,849 miles (4,585km). Throttle and brakes were controlled by a human driver.[42][43] 1995 One of Ernst Dickmanns' robot cars (with robot-controlled throttle and brakes) drove more than 1000 miles from Munich to Copenhagen and back, in traffic, at up to 120mph, occasionally executing maneuvers to pass other cars (only in a few critical situations a safety driver took over). Active vision was used to deal with rapidly changing street scenes. 1997 The Deep Blue chess machine (IBM) defeats the (then) world chess champion, Garry Kasparov. 1997 First official RoboCup football (soccer) match featuring table-top matches with 40 teams of interacting robots and over 5000 spectators. 1997 Computer Othello program Logistello defeated the world champion Takeshi Murakami with a score of 60. 1998 Tiger Electronics' Furby is released, and becomes the first successful attempt at producing a type of A.I to reach a domestic environment. 1998 Tim Berners-Lee published his Semantic Web Road map paper.[44] 1998 Leslie P. 
Kaelbling, Michael Littman, and Anthony Cassandra introduce the first method for solving POMDPs offline, jumpstarting widespread use in robotics and automated planning and scheduling[45] 1999 Sony introduces an improved domestic robot similar to a Furby, the AIBO becomes one of the first artificially intelligent "pets" that is also autonomous. Late 1990s Web crawlers and other AI-based information extraction programs become essential in widespread use of the World Wide Web. Late 1990s Demonstration of an Intelligent room and Emotional Agents at MIT's AI Lab. Late 1990s Initiation of work on the Oxygen architecture, which connects mobile and stationary computers in an adaptive network. Date Development 2000 Interactive robopets ("smart toys") become commercially available, realizing the vision of the 18th century novelty toy makers. 2000 Cynthia Breazeal at MIT publishes her dissertation on Sociable machines, describing Kismet (robot), with a face that expresses emotions. 2000 The Nomad robot explores remote regions of Antarctica looking for meteorite samples. 2002 iRobot's Roomba autonomously vacuums the floor while navigating and avoiding obstacles. 2004 OWL Web Ontology Language W3C Recommendation (10 February 2004). 2004 DARPA introduces the DARPA Grand Challenge requiring competitors to produce autonomous vehicles for prize money. 2004 NASA's robotic exploration rovers Spirit and Opportunity autonomously navigate the surface of Mars. 2005 Honda's ASIMO robot, an artificially intelligent humanoid robot, is able to walk as fast as a human, delivering trays to customers in restaurant settings. 2005 Recommendation technology based on tracking web activity or media usage brings AI to marketing. See TiVo Suggestions. 2005 Blue Brain is born, a project to simulate the brain at molecular detail.[46] 2006 The Dartmouth Artificial Intelligence Conference: The Next 50 Years (AI@50) AI@50 (1416 July 2006) 2007 Philosophical Transactions of the Royal Society, B Biology, one of the world's oldest scientific journals, puts out a special issue on using AI to understand biological intelligence, titled Models of Natural Action Selection[47] 2007 Checkers is solved by a team of researchers at the University of Alberta. 2007 DARPA launches the Urban Challenge for autonomous cars to obey traffic rules and operate in an urban environment. 2009 Google builds self driving car.[48] Date Development 2010 Microsoft launched Kinect for Xbox 360, the first gaming device to track human body movement, using just a 3D camera and infra-red detection, enabling users to play their Xbox 360 wirelessly. The award-winning machine learning for human motion capture technology for this device was developed by the Computer Vision group at Microsoft Research, Cambridge.[49][50] 2011 IBM's Watson computer defeated television game show Jeopardy! champions Rutter and Jennings. 2011 Apple's Siri, Google's Google Now and Microsoft's Cortana are smartphone apps that use natural language to answer questions, make recommendations and perform actions. 2013 Robot HRP-2 built by SCHAFT Inc of Japan, a subsidiary of Google, defeats 15 teams to win DARPAs Robotics Challenge Trials. HRP-2 scored 27 out of 32 points in 8 tasks needed in disaster response. 
Tasks are drive a vehicle, walk over debris, climb a ladder, remove debris, walk through doors, cut through a wall, close valves and connect a hose.[51] 2013 NEIL, the Never Ending Image Learner, is released at Carnegie Mellon University to constantly compare and analyze relationships between different images.[52] 2015 An open letter to ban development and use of autonomous weapons signed by Hawking, Musk, Wozniak and 3,000 researchers in AI and robotics.[53] 2015 Google DeepMind's AlphaGo defeated 3 time European Go champion 2 dan professional Fan Hui by 5 games to 0.[54] 2016 Google DeepMind's AlphaGo defeated Lee Sedol 4-1. Lee Sedol is a 9 dan professional Korean Go champion who won 27 major tournaments from 2002 to 2016.[55] Before the match with AlphaGo, Lee Sedol was confident in predicting an easy 5-0 or 4-1 victory.[56] 2017 Google DeepMind's AlphaGo won 60-0 rounds on two public Go websites including 3 wins against world Go champion Ke Jie.[57] 2017 Libratus, designed by Carnegie Mellon professor Tuomas Sandholm and his grad student Noam Brown won against four top players at no-limit Texas hold 'em, a very challenging version of poker. Unlike Go and Chess, Poker is a game in which some information is hidden (the cards of the other player) which makes it much harder to model.[58]

Read the rest here:

Timeline of artificial intelligence - Wikipedia

Is The Concern Artificial Intelligence Or Autonomy? : 13.7 … – NPR – NPR

There's a provocative interview with the philosopher Daniel Dennett in Living on Earth.

The topic is Dennett's latest book From Bacteria to Bach and Back: The Evolution of Minds and his idea that Charles Darwin and Alan Turing can be credited, in a way, with the same discovery: that you don't need comprehension to achieve competence.

Darwin showed how you can get the appearance of purpose and design out of blind processes of natural selection. And Turing, one of the pioneers in the field of computation, offered evidence that any problem precise enough to be computed at all can be computed by a mechanical device, that is, a device without an iota of insight or understanding.

But the part of the interview that particularly grabbed my attention comes at the end. Living on Earth host Steve Curwood raises the, by now, hoary worry that as AI advances, machines will come to lord over us. This is a staple of science fiction, and it has recently become the focus of considerable attention among opinion-makers. (Discussion of the so-called "singularity.") Dennett acknowledges that the risk of takeover is a real one. But he says we've misunderstood it: The risk is not that machines will become autonomous and come to rule over us; the risk is, rather, that we will come to depend too much on machines.

The big problem AI faces is not the intelligence part, really. It's the autonomy part. Finally, at the end of the day, even the smartest computers are tools, our tools, and their intentions are our intentions. Or, to the extent that we can speak of their intentions at all, for example of the intention of a self-driving car to avoid an obstacle, we have in mind something it was designed to do.

Even the most primitive organism, in contrast, at least seems to have a kind of autonomy. It really has its own interests. Light. Food. Survival. Life.

The danger of our growing dependence on technologies is not really that we are losing our natural autonomy in quite this sense. Our needs are still our needs. But it is a loss of autonomy, nonetheless. Even auto mechanics these days rely on diagnostic computers and, in the era of self-driving cars, will any of us still know how to drive? Think what would happen if we lost electricity, or if the grid were really and truly hacked? We'd be thrown back into the 19th century, as Dennett says. But in many ways, things would be worse. We'd be thrown back but without the knowledge and know-how that made it possible for our ancestors to thrive in the olden days.

I don't think this fear is unrealistic. But we need to put it in context. The truth is, we've been technological since our dawn as a species. We first find ourselves in the archaeological record precisely there where we see a great exposition of tools, technologies, art-making and also linguistic practices. In a sense, to be human is to be cyborgian, that is, a technologically extended version of our merely biological selves. This suggests that at any time in our development, a large-scale breakdown in the technological infrastructure would spell not exactly our doom, but our radical reorganization.

Perhaps what makes our current predicament unprecedented is the fact that we are so densely networked. When the library of Alexandria burned down, books and, indeed, knowledge, were lost. But in a world where libraries are replaced by their online versions, it isn't inconceivable that every library could be, simply, deleted.

What happens to us then?

Follow this link:

Is The Concern Artificial Intelligence Or Autonomy? : 13.7 ... - NPR - NPR

The growing impact of artificial intelligence on workplace collaboration – CIO

So Artificial Intelligence (AI) is all the rage these days. AI and machine learning algorithms are increasingly being brought to bear on collaborative business workflows and processes for automation and to enable intelligent conversational experiences. There is a paradigm shift in digital workplace technologies and strategies to make enterprises more conversational and smarter. Such conversational environments require data, content, people, applications and overall technology to be in an intimate contextual flow and persistent.

The incorporation of AI brings us to a new era of intelligent conversational environments and workspaces. We're seeing a bevy of technology providers respond with sometimes grandiose product announcements, but new products all the same, to address these requirements and play in this space. While some have clearly exaggerated their capabilities, there is significant potential to revolutionize workplace collaboration on the horizon.

The emerging focus on AI is really about making decision support systems more efficient across a multitude of applications, processes and business domains. I believe AI will bring intelligent collaboration capabilities to the emerging Conversational Workspace platforms, represented by vendors/offerings such as Slack, Atlassian HipChat, Microsoft Teams, Workplace by Facebook, Unify Circuit, Cisco Spark, RingCentral Glip, 8x8 Sameroom, MindLink, IBM Watson Workspace, ALE Rainbow, Fuze, Google Hangouts Chat, Jive and Nextplane nCore. So I just rattled off a long list of providers and offerings here in no particular order, because I want to make it abundantly clear that there is significant momentum. AI and chatbots are being incorporated in these offerings to improve workflows and to support conversational experiences.

Clearly, collaboration is critical for any organization to succeed. Businesses need to interact efficiently with both internal and external parties and constituents. The most effective way to nurture a collaborative workplace is to foster a culture in which collaboration and engagement are respected and rewarded.

What I've referred to as intelligent collaboration is really about the application of intelligence to collaborative interactions to achieve deeper insights that produce better decision-making at all points in the process. It may include virtual or voice assistance to make interactions easier and more automated. (The best-known virtual assistants are Apple's Siri, Amazon's Alexa, Google Assistant and Microsoft's Cortana.) Leading AI systems include a library of process-level routines, or bots, to assist in automating repetitive tasks. We expect other providers will bring similar capabilities to the market.

This brings us to an interesting market showdown of collaboration providers investing heavily in AI. The major technology vendors are now vying for dominance in the overall artificial intelligence space. There have been strong moves by major players such as Amazon, Apple, Google, IBM and Microsoft in this space. Interestingly enough, it's being rumored that Amazon is among other players interested in a Slack takeover. Slack could potentially be valued at $9 billion in a sale. Amazon and Slack together could potentially be an AI powerhouse with Slack's AI, the AWS developer ecosystem, Alexa and intelligent bots.

Google and Microsoft have both established strategic research divisions in the area of AI. Google benefits from its rich search inventory and deep investments in machine learning. At its May 17, 2017, Google I/O developers conference keynote, Google announced that google.ai will be the place to access everything it's working on in AI. Google already leverages its AI and machine learning capabilities in its collaboration and productivity offerings.

In like manner, Microsoft, at its recent Build 2017 conference, unveiled its Microsoft Cognitive Services offering of AI capabilities for developers. Microsoft already incorporates its AI capabilities throughout the Office 365 content, collaboration and productivity suite. Along with the Microsoft Bot Framework, the idea is to support conversational experiences. Both Microsoft and Google are trying to expand their ecosystem of partners and developers here.

However, this is about more than Google and Microsoft. Every collaboration vendor is trying to navigate its way here. Acquisitions have been a critical strategy to advance AI capabilities. For example, Cisco just announced its acquisition of MindMeld for $125M, which enables the deployment of AI-enabled conversational interfaces. I expect to see initial integration into Cisco's collaboration offerings such as Spark, its conversational workspace platform. Additionally, business applications providers focused on collaborative workflows and improving customer digital experiences, such as Adobe and Salesforce, have made significant strides in AI with Sensei and Einstein, respectively.

So what does all this mean? I cite the previous examples as signals of a fundamental shift in workplace collaboration and collaborative workflows. What vendors are responding to is the direct impact of digital disruption and transformation, which places increasing importance on enterprise collaboration workflows and information flows, and a refocus on creating better user experiences for the people involved in critical business processes. Adding AI in all its flavors, such as machine learning, natural language processing (NLP) and chatbots, will usher in a new wave of intelligent communications and collaboration, which will potentially enable better conversational experiences.


Excerpt from:

The growing impact of artificial intelligence on workplace collaboration - CIO

Facebook will use artificial intelligence to detect and remove terrorist content on the social network – Mirror.co.uk

Facebook on Thursday offered new insight into its efforts to remove terrorism content, a response to political pressure in Europe over militant groups' use of the social network for propaganda and recruiting.

Facebook has ramped up its use of artificial intelligence, such as image matching and language understanding, to identify and remove content quickly, Monika Bickert, Facebook's director of global policy management, and Brian Fishman, counterterrorism policy manager, explained in a blog post.

Facebook uses artificial intelligence for image matching that allows the company to see if a photo or video being uploaded matches a known photo or video from groups it has defined as terrorist, such as Islamic State, Al Qaeda and their affiliates, the company said.

YouTube, Facebook, Twitter and Microsoft last year created a common database of digital fingerprints automatically assigned to videos or photos of militant content to help each other identify the same content on their platforms.
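
The article does not spell out how those digital fingerprints work, but the general idea resembles perceptual hashing: reduce each image to a compact signature and compare signatures rather than raw files. Below is a minimal sketch of one common variant, an average hash, written in Python. It assumes the Pillow library is installed; the file names and the match threshold are invented for illustration, and this is not the actual algorithm used by Facebook or the shared industry database.

```python
# Sketch of perceptual "fingerprint" matching via an average hash.
# Assumes Pillow is installed; file names below are placeholders.
from PIL import Image

def average_hash(path, size=8):
    """Shrink to a small grayscale thumbnail and record which pixels beat the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    """Count differing bits; a small distance suggests visually similar images."""
    return sum(x != y for x, y in zip(a, b))

known = average_hash("known_extremist.jpg")   # fingerprint already in the database
upload = average_hash("upload.jpg")           # fingerprint of a new upload

if hamming(known, upload) <= 5:               # threshold chosen arbitrarily here
    print("near-duplicate of known content, flag for review")
```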

Similarly, Facebook now analyses text that has already been removed for praising or supporting militant organisations to develop text-based signals for such propaganda.

"More than half the accounts we remove for terrorism are accounts we find ourselves, that is something that we want to let our community know so they understand we are really committed to making Facebook a hostile environment for terrorists," Bickert said.

Germany, France and Britain, countries where civilians have been killed and wounded in bombings and shootings by Islamist militants in recent years, have pressed Facebook and other social media sites such as Google and Twitter to do more to remove militant content and hate speech.

Government officials have threatened to fine the company and strip the broad legal protections it enjoys against liability for the content posted by its users.

Asked why Facebook was opening up now about policies that it had long declined to discuss, Bickert said recent attacks were naturally starting conversations among people about what they could do to stand up to militancy.

In addition, she said, "we're talking about this is because we are seeing this technology really start to become an important part of how we try to find this content."

See original here:

Facebook will use artificial intelligence to detect and remove terrorist content on the social network - Mirror.co.uk

What’s the Difference Between AI and Machine Learning? – Machine Design

Buzzwords can help get a lot of attention on the web. But while these SEO keywords might help people find what they are looking for, they may also add fluff and garbage to searches. With terms like 3D printing and IIoT eliciting such a positive response, there's no end in sight. Add artificial intelligence (AI), machine learning, neural networks, and deep learning into the mix, and it can be confusing to keep up with which is which. So, to begin:

AI: Artificial Intelligence (AI) is usually defined as the science of making computers do things that require intelligence when done by humans. AI has had some success in limited, or simplified, domains (Courtesy of AlanTuring.net).

First, there are different types of artificial intelligence (AI): weak and strong. Weak AI might behave as though a robot or manufacturing line is thinking on its own. However, it's supervised programming, which means there is a programmed output, or action, for given inputs.

Strong AI is a system that might actually change an output based on given goals and input data. A program could do something it wasn't programmed to do if it notices a pattern and determines a more efficient way of accomplishing the goal it was given.

For example, when an AI program was instructed to obtain the highest score it could in the video game Breakout, it learned how to perform better and was able to outperform humans in just 2.5 hours. Researchers let the program run. To their surprise, the program developed a strategy that was not in the software: it would focus on one spot of bricks to poke a hole so the ball would get behind the wall. This minimizes the work, as the computer no longer has to move the bat while the score increases. It also minimizes the chances of missing the ball and ending the game.

Keep in mind that the computer isn't seeing the bat, ball, or rainbow-striped bricks. It sees a bunch of numbers. It knows what variables it controls, and how it is able to increase points based on how it controls those variables in relation to the other numbers.
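
To make that idea concrete, here is a minimal, hypothetical sketch of reward-driven (reinforcement) learning in Python. The real Breakout result used deep reinforcement learning on raw pixel input; this toy environment and its names are invented for illustration, and it keeps only the point the article makes: the program sees nothing but numbers and a score, and it still settles on a useful strategy.

```python
# Toy sketch of reward-driven learning: the "agent" sees only numbers and a score.
import random
from collections import defaultdict

ACTIONS = [-1, 0, 1]  # move paddle left, stay, move right

def step(paddle, ball):
    """Hypothetical one-row game: the ball drifts; reward 1 if the paddle is under it."""
    ball = (ball + random.choice([-1, 1])) % 10
    return ball, (1 if paddle == ball else 0)

q = defaultdict(float)            # learned value of each (state, action) pair
alpha, epsilon, gamma = 0.1, 0.1, 0.9
paddle, ball = 5, 5

for _ in range(50_000):
    state = ball - paddle         # the only "perception": a single number
    if random.random() < epsilon: # occasionally explore a random move
        action = random.choice(ACTIONS)
    else:                         # otherwise exploit the best move found so far
        action = max(ACTIONS, key=lambda a: q[(state, a)])
    paddle = max(0, min(9, paddle + action))
    ball, reward = step(paddle, ball)
    best_next = max(q[(ball - paddle, a)] for a in ACTIONS)
    # Q-learning update: nudge the estimate toward reward plus discounted future value
    q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])

# After training, the agent should prefer moving right when the ball is to its right.
print({a: round(q[(2, a)], 2) for a in ACTIONS})
```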

"Under AI there are a lot of different technologies: Some of them exist and function, others are not yet mature, others are simply buzzwords, says Matteo Dariol, a product developer for Bosch. In my experience, in real-world manufacturing, I have not heard of anyone using AI for operations, it is more plausible that R&D centers are studying and testing certain algorithms. Some industrial components like PLC, drives, motors, already include certain neural networks that could fall under the wide umbrella of AI, typical applications are providing more energy efficiency or quicker reaction time."

AI has bled into a general term that could mean several things, including machine learning. Adding to the confusion, some people associate AI with independent thinking. However, consider, under the definition above, a machine vision application that picks up a part and sets it in a particular orientation. This action is what a human would otherwise do, and it requires some level of intelligence. It may not take much intelligence, but it does fit the AI definition.

Neural Network: A computer system modeled after the human brain.

Big Data: Essentially a large set or sets of data that are needed for programs to accurately use AI features. As things become more complex, moving from AI to machine learning or machine learning to deep learning, the more data you have, the better these systems will be able to learn and function.

Machine learning is sometimes associated with a neural network. Similar to how the human brain operates, neural networks have many connections between nodes and layers of nodes. Training algorithms can use neural networks, so when input in the form of data is entered into the system, it will figure out (learn, decide, etc.) what the best course of action is. Using a massive amount of data (often called Big Data), the algorithm and network learn how to accomplish goals and improve upon the process. This type of extensive connectivity is referred to as deep learning.
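
As a small sketch of what that looks like in practice, assuming scikit-learn is installed: the feature values and labels below are made up, and note that a human has already decided which features (here, redness and skin smoothness) the model gets to see; the network only learns the decision rule.

```python
# Minimal "classic" machine learning sketch: hand-defined features, learned rule.
# Assumes scikit-learn is installed; all numbers are invented for illustration.
from sklearn.neural_network import MLPClassifier

# Hand-crafted features per fruit: [redness (0-1), skin smoothness (0-1)]
X = [[0.90, 0.85], [0.80, 0.90], [0.85, 0.80],   # apples
     [0.30, 0.20], [0.20, 0.25], [0.25, 0.15]]   # oranges
y = ["apple", "apple", "apple", "orange", "orange", "orange"]

# A small feed-forward neural network: connected nodes whose weights are adjusted
# during training until they encode the apple/orange boundary.
model = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
model.fit(X, y)

print(model.predict([[0.75, 0.80]]))  # likely "apple" for a red, smooth-skinned fruit
```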

Deep Learning: Deep learning (also known as deep structured learning, hierarchical learning, or deep machine learning) is the study of artificial neural networks and related machine learning algorithms that contain more than one hidden layer. (Courtesy: Wikipedia)

"Deep learning is a special type of machine-learning algorithm: it is multiple layers of neural networks that mimic the connectivity of the brain, and these types of connectivity seem to work much better than pre-existing systems," said Samarjit Das, a senior research scientist at Bosch. "We currently have to define parameters for machine learning based on our human experience. When we look at images of apples and oranges, we need to define features manually, so that machine-learning systems can identify the difference. Deep learning is the next level because it can create those distinctions on its own. By just showing sample images of apples and oranges to a deep-learning system, it will create its own rules, realizing that color and geometry are the key features that distinguish which are which, and not have to teach it based off human knowledge."
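
By way of contrast with the hand-defined features in the earlier sketch, here is a minimal sketch of the deep-learning approach Das describes, assuming PyTorch is available. Random tensors stand in for real apple and orange photos, so the numbers are meaningless; the point is that the input is raw pixels and no feature such as color or geometry is ever specified by hand.

```python
# Sketch of a tiny convolutional network: raw pixels in, learned features inside.
# Assumes PyTorch is installed; the "photos" here are random stand-in tensors.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),   # early layers learn simple local patterns
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(8, 16, kernel_size=3, padding=1),  # deeper layers combine them into shapes
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(16 * 16 * 16, 2),                  # final layer: apple vs. orange score
)

images = torch.randn(4, 3, 64, 64)               # fake batch of 64x64 RGB "photos"
labels = torch.tensor([0, 1, 0, 1])              # 0 = apple, 1 = orange (made up)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for _ in range(10):                              # a few illustrative training steps
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()

print("final loss:", loss.item())
```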

Machine Learning: A type of AI that can include but isn't limited to neural networks and deep learning. Generally, it is the ability for a computer to output or do something that it wasn't programmed to do.

Read more:

What's the Difference Between AI and Machine Learning? - Machine Design