What You Can Do to Build an Innovative Ecosystem – CMSWire


Innovation and research are no longer restricted to laboratories and research centers. Today, new ideas and innovative approaches emerge every day from people working on customer-facing situations and dealing with real-life issues. It is therefore essential to not only collect these ideas, but to encourage and reward them as well.

Companies such as Apple, Google, Samsung, 3M, Virgin Group, Nokia and Procter & Gamble have brought about significant changes in the nature of innovation itself. For these top performers, the winning equation is unassailable: Innovation equals growth. Innovation has evolved and taken many different forms over the last 50 years. The famous, patented 3M post-it was one such innovation. Since its invention, 3M has given creative time off to its employees, a concept many companies have embraced in recognition of the need to take a break in order to invent or re-invent an idea.

As Thomas Kuczmarski puts it,

There are inventors and there are innovators. One is creating a product with the dream of success. The other brings a product to market knowing with certainty that there is a need to be met. Understanding the difference and acting on it can provide an important stimulus for the economy in the challenging years ahead.

In our fast-moving, competitive society, the ability to drive change and transformation provides a competitive advantage. Managing innovation and creativity is the key to this ability. Any organization that has resolved to tap the innovation potential of its employee base is halfway towards re-inventing itself or potentially producing some truly exceptional solutions.

Though many theories discuss organizational innovation, below are five simple steps to building an innovative ecosystem within the workplace. These strategies combine resources with management practices to encourage and inculcate innovation in employees, such that the entire organization works as an innovation platform, generating and capturing new ideas.

Generate an environment where creative ideas flourish, not just in R&D but throughout the organization, at every level. Consumers and frontline staff are in the best position to know what is needed and the ubiquitous availability of technology is creating innovation ecosystems out of the control of large corporations. Transformational leadership rebuilds traditional organizations to create innovative organizational climates, encouraging the creativity and innovation of its employees. Methods to create such an environment could be as simple as introducing dual career ladders, mentoring programs, technical conferences, jam and think sessions, brainstorming workshops, webinars and brown bags, and think tank events. All of these could be characterized as innovation events or workshops organized for the sole purpose of collaborating and generating ideas.

Related Article: Don't Wait for the Innovation Lottery: A Deliberate Approach to Ideation

Provide an opportunity to prove the idea and surface the innovation to those who can make the change. Research has found that creative people demonstrate high performances under personal autonomy. It is important to create this opportunity by providing autonomy to employees to process their thoughts and present their ideas. Some companies have instituted "Think Fridays," an excellent example of making space for creation.

Connect the innovator to the sponsors and the implementers. Fast connections between senior leadership and grassroots have proven to be the most important enabler for an innovative organization. Collaboration across the lines of hierarchy is one of the key elements in capturing new ideas and taking action. Building networking into the culture sparks communication across the silos and encourages and inspires new ideas, with the right cultural mindsets in place.

Related Article: DevOps and the Culture of Inclusion

Encourage diversity of thought and remove limiting assumptions. All organizations need to dispel and discourage the belief that disenfranchised groups cannot innovate. All groups need to be included in decision making so they can demonstrate their ability. Lack of diversity leads to two limiting assumptions: that the dominant group is superior, so everyone should be (and think) like them; and that, because of this superiority, it should naturally have power over the others.

Some examples of limiting assumptions in information technology are:

Related Article: IT Needs to Face Its Isms

Focus on the goal and don't measure the performance. Measuring innovative performance is perhaps the best way to stifle it. Research has shown that evaluating the innovation performance of organizations primarily based on positive outcomes may stifle the risky experimentation necessary for progress in difficult and unpredictable environments. A very high percentage of nonprofit and government innovation occurs in spite of the odds. Pushing for innovation success while disregarding prevailing organizational hurdles may create negative outcomes and stifle innovation performance.

As Alan Kay puts it, "The best way to predict the future is to invent it." Inventions happen where innovation is encouraged through culture rather than institutionalized in a process. Technological advancement and rising competition in industrial and service companies have made innovation central to competitiveness. Organizations, particularly technology-driven ones, need to be more innovative and pioneering than before to lead, grow, compete and endure. Commercial organizations need to be efficient to survive in the short term and must encourage innovation and experimentation to survive in the long term. With the advent of social media and technological advancements, customers have changed from passive consumers to active participants. Organizations that have encouraged an all-round culture of innovation have seen the simultaneous emergence of new capabilities, from technologies to skills to global scale, of new disruptive business models, and of new ways in which innovation happens. There are many theories of encouraging innovation across organizations. However, taking specific steps to create a culture of independent thinking is critical for greater creativity and novelty at the organizational level.

Geetika Tandon is a senior director at Booz Allen Hamilton, a management and technology consulting firm. She was born in Delhi, India, holds a Bachelor's in architecture from Delhi University, a Master's in architecture from the University of Southern California and a Master's in computer science from the University of California Santa Barbara.

The views and opinions expressed in these articles are those of the author and do not necessarily reflect the official policy or position of her employer.


TrueFort Expands Fortified Ecosystem with Infoblox and Others – Business Wire

WEEHAWKEN, N.J.--(BUSINESS WIRE)--TrueFort, the application detection and response company, today announced the continued expansion of the TrueFort Fortified Ecosystem. The company is building upon its previously announced partnership with CrowdStrike, and now adds Infoblox to the program.

To protect applications and enable organizations to achieve full, 360-degree understanding of their behavior and context, the TrueFort Fortress XDR platform has been optimized to consume vast amounts of real-time telemetry into its advanced analytics engine to be able to accurately identify internal and external threats across all vectors.

"Without open integration and information sharing between the various security controls available today, malicious actors will continue to have great success attacking enterprises," said Ed Amoroso, CEO of TAG CYBER and former head of cybersecurity for AT&T. "Ask any Chief Information Security Officer (CISO) today what risk they are most concerned about and the majority will point to threats that target business applications."

Today's announcement reinforces the company's commitment to its ecosystem approach and to helping customers extract maximum value from the TrueFort platform and from their existing deployed investments in third-party products and data via open APIs and, especially, bi-directional information sharing.

"Through the Ecosystem Exchange model, Infoblox customers now have yet another way to extend the value of our Core Network Services data," said David Barry, Senior Director of Business Development at Infoblox. "The TrueFort Fortress XDR platform's ability to consume our telemetry enhances its application profiling for better policy management, while providing increased insight into unmanaged systems to fast-track application-layer threat detection."

TrueFort also announced today its membership in the Center for Internet Security SecureSuite which provides organizations access to multiple cybersecurity resources including the CIS-CAT Pro configuration assessment tool, build content, full-format CIS Benchmarks, and more. In addition, TrueFort Fortress XDR is expanding its footprint of protected application environments with a new listing on the VMware Solution Exchange as a data center and network security solution.

"To improve our customers' security posture while reducing operational overhead, as vendors we need to ensure smooth integration and information sharing between toolsets, while following industry best practices like the CIS Benchmarks," said Sameer Malhotra, CEO and Founder, TrueFort. "Through initiatives like the TrueFort Fortified Ecosystem, we look forward to promoting industry collaboration in 2020 and beyond."

About TrueFort

Applications are the lifeblood of business. TrueFort helps organizations align application security policy with operational reality via Fortress XDR, the industry's first application detection and response platform. Fortress XDR reverses the traditional infrastructure approach to security by comprehensively tracking application behavior to unify cloud workload protection and AppSec in a single console. Using real-time telemetry, patented advanced behavioral analytics and policy automation, enterprises can now visualize, microsegment, protect, hunt and investigate from the application layer. Founded in 2015 by former Wall Street senior IT executives, TrueFort offers unparalleled application visibility, control and protection with the shortest time-to-value through the TrueFort Fortified ecosystem and our unique bring-your-own-agent approach. For more information visit http://www.truefort.com and follow us on Twitter and LinkedIn.


Driving up the value of AI starts with building an ecosystem – CIO Dive

NEW YORK Driven by a desire to boost efficiency, a manufacturing customer once tapped General Electric subsidiary GE Digital to infuse its facilities and supply chain with artificial intelligence.

Involved in the existing tech stack was a cloud computing provider. But as GE entered the design stage of the project, it ran across a barrier: The cloud company wanted to be the one to extract data from the manufacturing equipment, saying its methods were faster than GE's.

"We argued for three months over that," said Rachel Trombetta, principal enterprise architect of GE Digital, speaking on a panel at The AI Summit in New York Wednesday. "And during that time we were not able to provide any value."

Stepping back and allowing GE to take point on the data extraction element of the project, while the cloud provider focused on the supply chain side, would have been a more collaborative approach, Trombetta said.

To navigate the complexities of AI at the enterprise level, companies stand to benefit from taking an ecosystem approach to AI deployment, one that leverages industry strengths and seeks cooperation between multiple stakeholders. In this framework, customer needs shape the priorities of the parties involved.

Given the talent deficit in the AI field, as well as in tech as a whole, there are often "not enough people" to deploy large-scale projects, which further underscores the need for an ecosystem approach, Trombetta said.

Though AI deployment is a complex endeavor, its potential lures even the most conservative industries.

Precisely because of its complexity, AI is the ideal space for an ecosystem environment that steps past individual limitations, according to Gauthier Vasseur, executive director of the Fisher Center for Business Analytics at the University of California, Berkeley, speaking on the panel.

"In an AI project, everything is thrown at us at the same time," Vasseur said. "Our firepower to look at problems holistically is limited if we're by ourselves. It's the ultimate partnership playground."

As talks and panels unfolded at The AI Summit, dozens of vendors pitched their platforms on a crowded showroom floor.

Seeing clearly where the highest value lies in the swarm of platforms and products the market can offer is "the most challenging job of the 21st century," said Igor Taber, SVP of corporate development and strategy at DataRobot.

"Industry doesn't make it easy to figure out what's what," said Taber. In that context, partnerships can ease the burden of understanding which providers to select.

In certain cases, though, the roles are reversed: AI and machine learning platforms can help companies determine which vendors to select, based on who can provide the products that most closely align with business needs and strategies.

Stepping past individual priorities and ceding ground on projects can be an uphill battle given tech's cut-throat competitiveness.

But larger rewards can await providers who adhere to an ecosystem framework. In the case of GE, a $3 million project became a $5 million one after additional partners were brought in to address the needs of an Australian mining company.

Deloitte, one of the partners, had an existing business relationship with the client. GE was brought into the fold because of its industrial expertise and the AI capabilities it could deploy at that layer.

"One of the things we suffered with was everyone wanted access to the data to prove their hypothesis first," Trombetta said. A valuable step was to collect the data and normalize it for its joint use across the project.

The result was a system that allowed the company to move minerals from a mining facility through manufacturing and to a nearby port on the day the mineral's price would be most beneficial.

Identifying ideal use cases for a partnership requires a customer-first approach. Though it may sound trite, customer interest isn't always the main driver for partnerships, Taber said.

"Taking AI for AI's sake is the first mistake we will do," said Vasseur. If stakeholders within a partnership lose sight of why AI is being deployed "then you'll be in big trouble."


OKEx Research: Ecosystem, and Innovative Products – newsBTC

OKEx is currently one of the biggest digital asset exchanges and trading platforms and has achieved significant progress since its inception. The platform is known for constant innovation, characterized by the regular introduction of new features for users. In this post, OKEx gives a brief overview of the company's journey in recent days and the ecosystem it is creating.

Highlights

I. Correlation of OKEx Major Tokens

Overall, the tokens' relationships with each other became positive.

II. Microscope on Correlations

III. One step further with a new product: OKEx USDT margined future

After simulation and beta testing, BTC futures contracts margined with Tether (USDT) launched successfully on Nov. 14. Starting at less than 1% of the original coin-margined BTC futures volume on day one, the product now accounts for more than half of its counterpart's daily volume, and it is still growing and attracting attention from market participants.

Here we present some figures for reference and will integrate this into the monthly correlation analysis, along with other USDT-margined futures.

As shown above, the volatilities move almost in tandem; however, their relative levels and the points where one crosses the other require attention.
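The kind of pairwise correlation analysis described above can be sketched with pandas. Note this is an illustrative sketch only: the ticker columns and price values below are hypothetical placeholders, not OKEx data, and the report does not specify which return definition it uses.

```python
import pandas as pd

# Hypothetical daily closing prices for a few tokens (placeholder data).
prices = pd.DataFrame({
    "BTC": [7200.0, 7350.0, 7100.0, 7400.0, 7500.0],
    "ETH": [145.0, 150.0, 142.0, 151.0, 154.0],
    "OKB": [2.60, 2.65, 2.55, 2.70, 2.72],
})

# Correlate daily returns rather than raw prices: price levels trend,
# which would artificially inflate the correlations.
returns = prices.pct_change().dropna()

# Pearson correlation matrix across all token pairs.
corr = returns.corr()
print(corr)
```

A rolling window (e.g. `returns.rolling(30).corr()`) would show how the relationships drift month to month, which is the shape of the monthly analysis the article alludes to.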

Disclaimer: This material should not be taken as the basis for making investment decisions, nor be construed as a recommendation to engage in investment transactions. Trading digital assets involves significant risk and can result in the loss of your invested capital. You should ensure that you fully understand the risk involved and take into consideration your level of experience, investment objectives and seek independent financial advice if necessary.


The Unter-class is the Whole Ecosystem – Fairplanet

Long before global warming entered public consciousness, they considered its effects through maps, talks, and various other artistic means. They collaborated with biologists, historians, architects, and urban planners to initiate dialogues on biodiversity, climate change, and community development. Their artworks pushed the boundaries of what constitutes art. A conversation on the state of affairs.

FairPlanet: Is your home your studio space?

Newton Harrison: Yes, I actually designed my house in a way where every room is a thought-producing site.

You recently gave a lecture on your work in Berlin. What's your connection to Europe?

Helen and I lived in Berlin while the wall was still up. We did work for documenta 8 during the '80s, which was called Kassel Works. The work was meant as a critique of documenta itself and of Kassel's history, with the Gestapo headquarters that used to be located there. It was an out-front attack on the Hitler-generated planning that shaped the rebuilding of Kassel after the firebombing. When then-President von Weizsäcker saw this work, he immediately understood what we were doing and why, and asked us to come work in Berlin. That's how we ended up there. And that's when we started working in Europe much more heavily.

What projects stood out for you during this time?

We did Peninsula Europe, which was funded by the EU and the German government and took up about 300 square metres of museum space. The work addresses the high grounds of Europe from the perspective of its drain basins, its ecosystems and its waters. At the time the work was done, in 1999, we believed that Europe's water supply was going to be badly affected, partly by drought, partly by profound misuse of waters. So we argued that a new kind of forestry, running across the central massif right into Portugal, needed to happen. We proposed concrete reforestation and created a first map, which was basically a re-mapping of all of Europe. We designed new continental high grounds, not along the outlines of Europe, but from where all the rivers began. Every map we designed is a complex narrative, and it is almost always about biodiversity. You need to understand: every regular map is almost always about how to get from here to there, and thereby places here and there. What we did was create a new ecologically-driven narrative.

How would this narrative map influence Europe as a whole?

If applied, Europe would basically change into a large number of city-states. We yank out the roads, intensify the rivers, intensify the high grounds, take out all national boundaries. As simple as that.

What was the thought process behind this idea?

When you look at Portugal, you see water retention landscapes that let you produce food in dry climates. We propose that something like 100,000 of those landscapes be made. They would basically function like small villages. Eventually, this would eliminate monocultural farming and reinstate biodiversity in the countryside, while creating a new kind of farming. That's the kind of transformation we're looking at, one which not only fixes temporary problems, but looks for a transformation of the whole system.

You and Helen both used to work as scientists. At which moment did you move to art?

We were the first couple that ever shared a single professorship at a university. That was in the early sixties, and we were both very active in the peace movement. But we were looking for deeper ways of encounter and political action. Helen read this book, Silent Spring, by Rachel Carson, which influenced her greatly, and me, too. By 1969, we understood that extinction was an actual possibility and that global warming would happen. It was right in our faces! Any smart person who studied the figures could see this by 1970. So we decided to do work that benefited the ecosystem. Also, we were not scientists, but artists able to do science when needed.

And how was that work received at the time?

Well, once we were asked to do a piece on endangered species for a show in New York, and we said, look, the most endangered species isn't actually the lion, tiger or hippopotamus that everyone is so romantic about. The most endangered species is topsoil. About 12 million square miles of topsoil across the planet are being ruined because of all the carbon released to the atmosphere and the so-called green revolution, which from the perspective of agri-business has turned into the green devolution. So, as one of our first projects, we made earth. We started these urban farming projects as art pieces. It's a hard process. In order to make rich topsoil, it takes three full months, about 2,000 earthworms, four tons of shit, leaf mould, a little clay, a little sand, and pissing on it from time to time.

How did this become art?

Everything I do, or we did, is art. We understood we had to change our field and become what I would call public service artists. Galleries started showing urban gardens, transplanted meadows and plantations of ours. But our research process was equally important. We could foresee almost everything we're facing today. We created a map of the world in which the rising of ocean levels is predicted precisely.

Why is it so hard to change people's minds on this issue?

Well, somehow, people back then in the early '70s, like people today, seem to prefer death and total transfiguration to the loss of capital. What does this tell you? There's something deeply wrong with capitalism itself. You heard of Herbert Marcuse? He used to be at my university back in the day. In the early '70s he and I had a meeting; we were talking and he says to me: "Newton, your work is as useless as the women's movement." And I said, "Oh, really?" He said, "Yes, it's a form of regressive de-sublimation." Which was code for: you're working on less important things than the class struggle. I lost my temper at him. I said, "Herbert, the Über-class is the whole human race. The Unter-class is the whole ecosystem. Anybody can screw the Unter-class with a shovel." A few years later he came over and admitted I might have had a point.

There was a point where your art projects were taken up by city planners, like in A Vision for the Green Heart of Holland project. Can you tell me about this?

Sure, they were going to put 600,000 houses, farming units, schools, etc. right into the centre of the so-called Green Heart. The Green Heart is a huge, diverse farming operation at the centre of a ring of cities including Amsterdam, Rotterdam, Utrecht and others. Its landforms represent the very history of Holland and the formation of democracy. It's about 800 square kilometres, consisting mainly of small villages and windmills. They were going to give it up. Also, it's the great central park of Holland. So we got called up to create a plan for the Green Heart, which we did, by generating a new plan and by pointing out other areas where they could easily put 600,000 houses while still encouraging biodiversity in the urban, suburban and agricultural communities.

For a 1971 exhibit in Boston you created Hog Pasture, in which you created an actual pasture indoors, hoping to have a real hog root in it, but the museum refused. In 2012, the Museum of Contemporary Art, Los Angeles allowed you to re-create the work with a real pig.

Yes, the pig's name was Wilma. It turned out that Wilma had never been in a natural environment before. She didn't know what to do with that hog pasture! Usually, pigs just walk around and eat the roots. So I had to push Wilma's nose into the ground, into the roots, and suddenly her tail started to wag 100 miles an hour. She dug up the whole artwork. It's an interesting thing to observe, a really, really happy pig.

What do you think is our most pressing challenge for the future?

Look, in America today, oil lobbyists are buying politicians for whatever purposes they want. It's not a democracy anymore. As long as the system keeps operating as it currently does, we will mostly die. Only small groups in small villages, maybe near the north pole, can survive this. The last time planet earth underwent such a change from cold to warm, it took about 2,000 years, so there was relatively lots of time to adapt. Today it'll take about 100 years. There simply is no time to adapt. The conversations that have to happen, and that Helen and I tried to push for decades, need to focus on what I call paleo-botanical research. We need to find out more about species that used to live in much warmer conditions and conducted a sort of migration not through space, but through time. With this knowledge one might slow down extinctions, so that there would be a fair survival rate. If we don't invest in such research, there won't be any increase in the survival rate; civilisation will simply collapse. Or do you have a better idea?

Certainly not. In recent years, especially over this year and last, there has been growing attention to the topic. Climate issues have driven elections. What are your thoughts on this shift?

We have to think about these issues on a long-term scale. Basically, what we have to do everywhere to help the climate today is take care of about 10,000 hectares of forest for about 200 years, and you'll have an ancient forest; it generally takes 200-400 years. But 20 years from now, it's already going to be so much hotter that species will have to move and entire ecosystems will need to regroup at best. At worst, they will fail.

What are you working on at the moment?

I'm working on a number of projects. One of them includes an idea for the borderline between North and South Korea. If they created a giant biodiversity corridor between their countries, this area could teach them how to rehab their countries ecologically as this becomes a greater necessity in the future.

Could this translate into politics?

Of course. In order to realise this idea, they would need a kind of giant central park between their countries. If you look at North Korea today, there's rarely a single tree standing. They did such bad farming and treated their people so badly when it got cold, so now there are hardly any trees left.

What does the ideal future look like for you?

I don't have one. But the things I'd like to see happen are these: a complete revision of capitalism, so that it's understood that the ultimate capital is not money, but the overproduction of the life web itself. Look at crabs: the mother crab produces three million eggs. She expects, maybe, 30 eggs to live. So why did she make millions of eggs for few to live? It's a survival package that constitutes food for many other species which eat the eggs. Every species overproduces in this way to ensure its own survival. That is what our future capital needs to be. The human condition is not the most important thing in the world. The life web itself is the most important thing. All philosophy that centres humans at its core is wrong. We need a new Copernican revolution. Science can no longer be human-centric.


Paige Raises $45M to Expand AI-Native Digital Pathology Ecosystem to Accelerate Biomarker Discovery – HIT Consultant

Paige raises $45M in Series B funding led by Healthcare Venture Partners with participation from Breyer Capital, Kenan Turnacioglu, and others.

The funding will be used to accelerate commercial efforts of its AI-native digital pathology ecosystem in the U.S., Europe, Brazil, and Canada.

Paige, a NYC-based leader in computational pathology transforming the diagnosis and treatment of cancer, today announced it has closed its Series B funding round of $45 million, bringing the Company's total capital raised to over $70 million. Healthcare Venture Partners brought the largest contribution to the round, with Breyer Capital, Kenan Turnacioglu, and other funds participating. Paige will use this new capital to drive FDA clearance of its products and expand its portfolio, delving deeper into cancer pathology, novel biomarkers, and prognostic capabilities. Additionally, the Company will accelerate commercial efforts in the U.S. and expansion in Europe, Brazil, and Canada.

Impact of Pathology on Cancer Diagnosis

Pathology is the cornerstone of cancer diagnoses. The field is on the cusp of a revolution towards digital, augmented clinical analysis. Paige aims to leverage cutting-edge AI and a vast, proprietary dataset to provide powerful new insights to pathologists, researchers, and pharmaceutical development teams.

Transforming the Diagnosis and Treatment of Cancer

Founded in 2018, Paige's mission is to revolutionize the diagnosis and treatment of cancer by providing pathologists, clinicians and researchers with insights drawn from decades of data diagnosed by world experts in cancer care. Spun out of Memorial Sloan Kettering, Paige builds powerful, clinical-grade computational technologies to transform diagnosis, treatment and biomarker discovery for cancer. With AI positioned to open a new future for pathology, Paige has created an AI-native digital pathology ecosystem that enables pathologists to achieve higher-quality, faster-throughput and lower-cost diagnosis and treatment recommendations. Additionally, Paige accelerates new biomarker discovery and is built to generate new insights into pathways and drug efficacy.

Medical AI at an Unprecedented Scale

The company plans to deliver the technology via partnerships, such as the recently announced Philips deal, and via Paige's own AI-native platform. Paige has a comprehensive license with MSK and exclusive rights to its library of 25 million pathology slides, one of the largest tumor pathology archives. Paige plans to build on MSK's efforts and digitize millions of archived slides. This digital trove, along with anonymized clinical data, allows the company to train models at scale and uncover new connections between pathology, genomics, treatment response and patient outcomes. In addition, Paige offers custom solutions for drug development teams, from pre-clinical modules to automated pathology analysis for clinical trials and biomarker development, creating new possibilities to expedite and better inform teams bringing new therapeutics to market.

"The funding comes on the heels of a milestone year: Paige achieved the first FDA breakthrough designation for AI technology in Pathology and Oncology and later received the first CE mark in the space," added Thomas Fuchs, Founder of Paige and a researcher at Memorial Sloan Kettering (MSK). The Company also grew its digital slide archive to more than 1.2M images and is developing systems to combine digital slides with genomic, drug response and outcome information to create powerful new diagnostic solutions.


Dotscience Gains Momentum in the MLOps Ecosystem and Accelerates Deployment of Machine Learning Models into Production with New Technology…

LONDON & SAN FRANCISCO--(BUSINESS WIRE)--Delivering on its vision that ML engineering should be just as easy, fast and safe as modern software engineering when using DevOps techniques, Dotscience, the market leader in DevOps for Machine Learning (MLOps), today announced new partnerships with GitLab and Grafana Labs; deep integrations to include Scikit-learn, H2O.ai and TensorFlow; expanded multi-cloud support with Amazon Web Services (AWS) and Microsoft Azure; and a joint collaboration with global enterprises to develop an industry benchmark for helping enterprises get maximum ROI out of their AI initiatives.

"MLOps is poised to dominate the enterprise AI conversation in 2020, as it will directly address the challenges enterprises face when looking to create business value with AI," said Luke Marsden, CEO and founder at Dotscience. "Through new partnerships, expanded multi-cloud support, and collaborations with MLOps pioneers at global organizations in the Fortune 500, we are setting the bar for MLOps best practices for building production ML pipelines today."

Developing MLOps Gold Standard with Global Enterprises

AI-derived business value is forecast to reach $3.9 trillion in 2022. However, many businesses continue to struggle with deploying ML models into production, posing challenges around the value of AI in the enterprise. To address this challenge, Dotscience is collaborating with global enterprises to help them get the most from an early investment in AI.

"Dotscience not only gives productivity benefits to data scientists but also gives those in governance roles assurance that the firm is doing all it can to mitigate risk from AI," said Charles Radclyffe, head of AI at Fidelity International.

This collaboration builds upon the recently announced joint efforts with S&P Global to develop best practices for collaborative, end-to-end ML data and model management that ensure the delivery of business value from AI.

Dotscience Expands Partnerships to Help Enterprises Accelerate the Path to AI

Grafana Labs, the open observability platform, and Dotscience are partnering to deliver observability for ML in production. With Dotscience, ML teams can statistically monitor the behavior of ML models in production on unlabelled production data by analyzing the statistical distribution of predictions. The partnership dramatically simplifies the deployment of ML models to Kubernetes and adds the ability to set up monitoring dashboards for deployed ML models using cloud-native tools including Grafana and Prometheus, which reduces the time spent on these tasks from weeks to seconds.
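One common way to monitor unlabelled production predictions statistically, as the article describes, is to compare the distribution of recent prediction scores against a training-time baseline. The sketch below uses a Population Stability Index (PSI); this is a standard illustration of the idea, not Dotscience's actual implementation, and all data values are invented.

```python
from bisect import bisect_right
from math import log

def psi(baseline, recent, edges):
    """Population Stability Index between two prediction samples,
    bucketed by the given bin edges. 0 means identical distributions."""
    def proportions(sample):
        counts = [0] * (len(edges) + 1)
        for x in sample:
            counts[bisect_right(edges, x)] += 1
        # Floor each proportion to avoid log(0) on empty buckets.
        return [max(c / len(sample), 1e-6) for c in counts]

    base, prod = proportions(baseline), proportions(recent)
    return sum((p - b) * log(p / b) for b, p in zip(base, prod))

# Model confidence scores at training time vs. in production (invented).
training_scores = [0.05, 0.1, 0.15, 0.2, 0.8, 0.85, 0.9, 0.95]
production_scores = [0.4, 0.45, 0.5, 0.55, 0.6, 0.65, 0.5, 0.55]

drift = psi(training_scores, production_scores, edges=[0.25, 0.5, 0.75])
# A PSI above roughly 0.25 is commonly treated as significant drift.
if drift > 0.25:
    print("ALERT: prediction distribution has shifted")
```

A metric like this can be recomputed on a schedule and exported to a dashboard, which is the kind of monitoring the Grafana partnership above targets.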

"At Grafana, we believe AI is a big growth opportunity for observability," said Tom Wilkie, VP of Product at Grafana Labs. "With Dotscience, the process for training AI models is simplified. The integration with Grafana enables data science teams to continuously monitor these trained models in production. By bringing DevOps practices to ML, data science and ML teams can eliminate silos, maximize productivity and minimize MTTR if there are issues with a model that is being observed."

In a separate press release today, Dotscience also announced a native GitLab integration. As a GitLab Technology Partner, Dotscience is extending the use of its platform for collaborative, end-to-end ML data and model management to the more than 100,000 organizations and developers actively using GitLab as their DevOps platform.

Dotscience Increases Accessibility to Growing MLOps Ecosystem with Added Multi-Cloud Support

The Dotscience platform is available as SaaS or on-premises and empowers ML and data science teams in industries including fintech, autonomous vehicles, healthcare and consultancies to achieve reproducibility, accountability, collaboration and continuous delivery across the AI model lifecycle.

Dotscience is now available on the AWS Marketplace, enabling AWS customers to easily and quickly deploy Dotscience directly through AWS Marketplace's 1-Click Deployment, and through Microsoft Azure.

"Finding the right software to meet your specific business needs can be challenging, particularly for data scientists and machine learning teams for whom the options have been limited," said Marsden. "Extending the installation possibilities of Dotscience to include AWS and Azure gives more companies access to an integrated ML platform that provides the unified version control and collaboration these teams need to simplify, accelerate and control AI development."

Dotscience Expands Frameworks in Which Data Scientists Can Deploy ML Models

Dotscience has expanded the frameworks in which data scientists can deploy tested and trained ML models into production and statistically monitor the productionized models to include Scikit-learn, H2O.ai and TensorFlow. These new integrations make Dotscience's recently added deploy-and-monitor platform advancements (the easiest way to deploy and monitor ML models on Kubernetes clusters) available to data scientists using a greater range of ML frameworks.

"A key principle for Dotscience is that it has always been agnostic in terms of the exact framework data scientists can use for ML development," continued Marsden. "As a natural next step in our product progression, we are enabling data scientists to deploy on their preferred ML framework."

In a separate press release today, Dotscience announced that leading open banking API provider, TrueLayer, has deployed Dotscience to enable reproducibility, provenance and metric tracking of AI models.

About Dotscience

Dotscience, the pioneer in DevOps for machine learning (MLOps), brings DevOps principles followed by high-performing software teams to ML and data science. The Dotscience software platform for collaborative, end-to-end ML lifecycle management empowers ML and data science teams in industries including fintech, autonomous vehicles, healthcare and consultancies to achieve reproducibility, accountability, collaboration and continuous delivery across the AI model lifecycle. Founded in 2017 by container storage veteran Luke Marsden, Dotscience is headquartered in the UK with offices in the US. Its mission is to accelerate and unlock the true value of data and analytics assets through AI.


Rakshit Shetty talks about ‘Avane Srimannarayana’ and the eco-system that enables his work – The Hindu

In the nine years he has been in the Kannada film industry, Rakshit Shetty has been called an innovator, game-changer and part of the Kannada new wave, among other things. All these sit lightly on the Udupi-born actor, who has powered ahead, doing the films he likes, creating content that is deeply rooted, and not letting talk about the gap between films get to him.

For three years now, Rakshit has been dreaming only about Avane Srimannarayana (ASN), the mega-budget film he has co-written, which is produced by Pushkara Mallikarjunaiah and HK Prakash. The film is directed by Sachin Ravi and stars Shanvi Srivastava in the lead. "I do not worry too much about the gap that everyone is talking about. I don't allow the tags associated with me to influence or pressurise me either. I only do films that I love, and it so happens that they are different," he says.

Rakshit gives credit to his growing-up years in Udupi for shaping him into what he is. "Most films I write are based on what I have seen growing up, and I could write Ulidavaru Kandanthe because of that familiarity. Likewise, Kirik Party drew a lot from my engineering days. This connection to my roots has helped me tremendously as a writer. I believe writing is where it all begins; you cannot make a film look different, you have to write it differently. And there is a little bit of the real me in every character I write, be it Richie (Ulidavaru...), Karna (Kirik Party) or Narayana (ASN)."

The actor is among a handful of multi-taskers who also write, direct and produce films. "If I had to rate them in order of preference, it would be writer, director, actor and then producer," he smiles. Rakshit took on that last role because he believes it is his way of giving back to the industry and of keeping the circle of kindness going by helping emerging talent.

ASN has been trending on social media with its innovative promotional campaigns, featuring puzzles and number games. The first teaser released 18 months ago, the second six months ago, and recently the trailer. And there was a connecting link that audiences were asked to guess. "As an artiste, I like things to be interactive and like the audience to get involved in cracking the code, as it were. I like to show them my art, but hide a few golden eggs," laughs Rakshit, who cannot leave a puzzle unfinished, especially if it has to do with math and science. He also loves gaming, as is evident in the contests. "When I get a day or two off, or when my nephew and niece visit me, I settle down with my PS4. My favourites are Assassin's Creed and Far Cry."

In promotional interviews, Rakshit speaks little about the plot of ASN. "I believe your content is the biggest publicity. If it is strong, you do not have to spend money to promote it. This is what I followed with Ulidavaru... and Kirik Party. I, however, work to ensure the audience emotionally connects with the film before release."

Ask Rakshit how he keeps his energy levels up when a project is spread over time (ASN releases three years after Kirik Party) and he says that it does not require effort, only a high degree of attachment. "My stint in the industry is a dream. I have been given the privilege of living this dream and love it. After ASN, I will probably think of something bigger. When you are involved, you forget the concept of space and time and live in the now. I don't allow any other thoughts to enter my head, the number game or who is doing what. This is not a competition and I like to have fun and be happy even as I work."

Helping Rakshit run his own marathon is his immediate creative eco-system, which is made up of like-minded people. "I am clear that even if it is a friend, only those who are in sync with the film will work on it. It is difficult to give it your all if you don't believe in what is being made. This is why I began Paramvah Studios too, because I thought it was unfair to ask producers to back a film they did not believe in. I wanted a Plan B so that, irrespective of what happens, the film will get made. I am comfortable working with Pushkara, because he is as passionate about his projects. Rishab Shetty is part of my team, then there is Hemanth Rao, whom I completely trust. Kiranraj (who directed Sagara Sangama in the anthology Katha Sangama) is directing Charlie 777; we have worked together for long, and when he narrates something, I know how he will direct it."

On sets with younger directors, Rakshit does not see himself as a mentor, but as a team player. "For me, all this is part of a never-ending education. I read a lot and love to share what I read. I never went to a film school or worked under a director. I learnt filmmaking reading books and making short films. When I share what I learn, it is reinforced and results in a lot of discussion. We all learn, and that is paramount," says Rakshit, who is now writing his dream project Punyakoti and awaiting the scripting of Richie.

All about dialects

Rakshit is part of the small group of filmmakers who are drawing attention to the lyrical, sing-song Kannada of the Dakshina Kannada region. "In my initial days in Bengaluru, I realised that everyone loved my Kannada, though I spoke the same content as them. I also believe that tinge of humour we have in Mangaluru Kannada cannot be found elsewhere. I know this language, and use it on screen when I can. But it is important to know a culture before one presents it. North Karnataka's Kannada is very different, and only someone from there can do justice to it in a film. Films are a documentation too, and I believe every dialect has to be explored in some way in cinema."


Are Apple Macs worth the money? – scallywagandvagabond

Should you buy an Apple Mac? Image via Unsplash.

If you're in the market for a new computer, you might well have glanced over at those shiny Apple Mac desktops and laptops at the store. Let's face it: they are designer technology, but are they practical too? And are they even worth the price premium compared to Windows-based systems?

You might think that Apple Macs are overpriced and seldom used by most people. But you'd be wrong! Believe it or not, they are worth the money, even if the price ticket leaves you feeling somewhat shocked. Here's why it makes sense to give serious consideration to buying an Apple Mac:

One of the brilliant things about Apple technology is that it all works seamlessly within its ecosystem. For example, if you have an iPhone and a Mac, and you're working at your Mac, it will alert you when you get a call or message on your iPhone. It's quite a handy feature to have, especially if you tend to leave your phone on silent! You can also write and reply to text messages on your Mac, and even take calls on it if your iPhone is nearby.

Because they are all part of the same Apple ecosystem, sharing content between the devices is simple. That means you could, for instance, open up an image on your Mac that was taken with your iPhone's camera.

The shared ecosystem concept is something Microsoft finally got around to mimicking, but without much success. Google, to some degree, has attempted a similar idea with Android and the Google account. But there isn't a high take-up of Google's Chromebook devices compared to regular tablets and laptops.

Apple Mac computers don't use Microsoft Windows, but rather an operating system called macOS (formerly OS X), a UNIX-based system. As with Windows, macOS is a closed operating system, meaning Apple doesn't share much of its source code. Because of that, Apple is solely in charge of issuing software updates and bug fixes to macOS.

Historically, the same cannot be said for Microsoft Windows when it comes to quick updates and bug fixes. So, if you want a computer with an up-to-date operating system, you're better off choosing a Mac.

Believe it or not, the majority of Apple Mac desktops and laptops will outlive their PC equivalents. There are several reasons why that is so:

Many PC users don't like to admit it, but there's always the thought in the back of their minds of whether their computers will operate as expected each time they use them. Fortunately, with an Apple Mac, you don't need to have such a lack of confidence. That's because Macs do exactly what is expected of them, giving you confidence that you can continue your work without issue.

On the rare occasion that your Mac has a technical hiccup, it's pretty easy to diagnose the source of the problem and get it sorted. With a Windows-based system, the issue could be down to virtually anything! As such, tracking down the culprit can sometimes take a long time.

Lastly, if you've never used a Mac before, one thing you will notice is that it doesn't feel cheap and plasticky. Yes, they are expensive machines, but they are high-end premium products. If you owned a Mac from new for five years, it would still feel as reliable and durable as it did the first day you used it.


Silicon Labs and Z-Wave Alliance Expand Smart Home Ecosystem by Opening Z-Wave to Silicon and Stack Suppliers – PRNewswire

Expected to be available in the second half of 2020, the opened Z-Wave Specification will include the ITU-T G.9959 PHY/MAC radio specification, the application layer, the network layer and the host-device communication protocol. Instead of being a single-source specification, Z-Wave will become a multi-source, wireless smart home standard developed by collective working group members of the Z-Wave Alliance. With more than 100 million interoperable devices deployed, more than 3,200 certified products and over 700 member companies, Z-Wave has the most mature and pervasive smart home ecosystem in the market.

Alliance members and smart home consumers will benefit from the hallmark features of Z-Wave, including interoperability, backwards compatibility, the S2 security framework, easy installation with SmartStart, low-power functionality with a 10-year battery life and long-range with sub-GHz mesh. The Z-Wave Alliance will maintain the certification program and expand the offering to provide technology vendors with both hardware and stack certification and product manufacturers with application layer certification.

"As a standards organization, the Z-Wave Alliance will help solve the interoperability challenges hindering the adoption of smart home devices," said Mitch Klein, executive director for the Z-Wave Alliance. "Members will work together on a single sub-GHZ connectivity solution that guarantees the forward-and-backward compatibility, interoperability, security and robustness needed to grow the IoT. The Z-Wave Alliance will collectively advance a fully realized smart home standard."

Silicon Labs is committed to IoT standardization. By expanding access to Z-Wave as a standard supported by multiple vendors, the smart home ecosystem will benefit both from broader technology support as well as accelerated market adoption.

"Silicon Labs has worked to create positive alignment across the industry with the goal of advancing both security and compatibility in smart home devices," said Jake Alamat, vice president and general manager of IoT home and consumer products at Silicon Labs. "Future success for the smart home industry relies on ecosystems getting closer, not farther apart. The smart home market opportunity is tremendous, and we want to help drive its success. When the ecosystems work together toward a common goal, the entire industry including manufacturers, developers, retailers and consumers benefits from this open collaboration."

"As an early adopter of Z-Wave technology, we welcome this move by Silicon Labs,"said George Land, Z-Wave Alliance board member and general manager of digital products at Trane. "Enabling an even broader ecosystem of interoperability will bolster both consumer and manufacturer confidence, driving overall growth of the industry."

Silicon Labs will continue to invest in Z-Wave technology and contribute to its future growth, collaborating with new suppliers through the expanded Z-Wave Alliance. Development on the opened Z-Wave Specification will be managed by the new working groups in the Alliance in Q3 2020, and details on the silicon and stack platform certification program also will be announced in Q3.

To learn more about Z-Wave technology, please visit silabs.com/z-wave. For more information about Z-Wave Alliance membership, please visit z-wavealliance.org/z-wave-specification.

Companies interested in joining the Z-Wave Alliance can also visit the organization's booth at CES 2020, Sands #41917.

About Z-Wave

Z-Wave technology is an internationally recognized ITU standard (G.9959). With more than 3,200 certified interoperable products worldwide, Z-Wave is the leading wireless home control technology in the market today. Represented by the Z-Wave Alliance and supported by more than 700 companies around the world, the standard is a key enabler of smart living solutions for home safety and security, energy, hospitality, office and light commercial applications.

About Silicon Labs

Silicon Labs (NASDAQ: SLAB) is a leading provider of silicon, software and solutions for a smarter, more connected world. Our award-winning technologies are shaping the future of the Internet of Things, Internet infrastructure, industrial automation, consumer and automotive markets. Our world-class engineering team creates products focused on performance, energy savings, connectivity and simplicity. silabs.com

Connect with Silicon Labs

Silicon Labs PR Contact: Dale Weisman, +1-512-532-5871, dale.weisman@silabs.com. Follow Silicon Labs at news.silabs.com, at blog.silabs.com, on Twitter at twitter.com/siliconlabs, on LinkedIn at linkedin.com/company/siliconlabs and on Facebook at facebook.com/siliconlabs.

Z-Wave Alliance PR Contact: Caster Communications +1-401-792-7080, zwave@castercomm.com

Cautionary Language

This press release may contain forward-looking statements based on Silicon Labs' current expectations. These forward-looking statements involve risks and uncertainties. A number of important factors could cause actual results to differ materially from those in the forward-looking statements. For a discussion of factors that could impact Silicon Labs' financial results and cause actual results to differ materially from those in the forward-looking statements, please refer to Silicon Labs' filings with the SEC. Silicon Labs disclaims any intention or obligation to update or revise any forward-looking statements, whether as a result of new information, future events or otherwise.

Note to editors: Silicon Labs, Silicon Laboratories, the "S" symbol, the Silicon Laboratories logo and the Silicon Labs logo are trademarks of Silicon Laboratories Inc. Z-Wave is a registered trademark of Silicon Labs and its subsidiaries in the United States and other countries. All other product names noted herein may be trademarks of their respective holders.

SOURCE Silicon Labs

http://www.silabs.com


Transforming the insurance sector to an Open API Ecosystem – Finextra

1. Introduction

"Open" has recently become a new buzzword in the financial services industry, i.e.open data, open APIs, Open Banking, Open Insurance, but what does this new buzzword really mean?"Open" refers to the capability of companies to expose their services to the outside world, so thatexternal partners or even competitorscan use these services to bring added value to their customers. This trend is made possible by the technological evolution ofopen APIs(Application Programming Interfaces), which are thedigital portsmaking this communication possible.

Together, companies interconnected through open APIs form a true API ecosystem, offering a best-of-breed customer experience by combining the digital services of multiple companies.

In the technology sector this evolution has been ongoing for multiple years (think of the travel sector, which lets you book any hotel online). An excellent example of this trend is the success story of Uber. In just a few years this company has acquired a market capitalisation larger than that of BMW, while Uber mainly combines API services offered by other companies:

Positioning is done by the operating system (iOS, Android)

Route calculation and maps are provided by MapKit and Google Maps

Twilio sends real time text messages to the customers

Payment is handled by Braintree

The receipt is sent via Mandrill

The services are hosted in the cloud on Amazon Web Services (AWS)

Combining these best-of-breed API services allows start-ups like Uber to deliver an excellent and innovative user experience in a very short time frame, thus facilitating rapid growth.
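The orchestration pattern behind the Uber example can be sketched as follows. The service functions here are hypothetical stand-ins for the real providers named above (mapping, payments, messaging), not their actual APIs, and all values are invented; the point is how one customer journey is assembled from several third-party calls.

```python
# Hypothetical stand-ins for third-party provider APIs.
def get_route(pickup, dropoff):          # mapping provider
    return {"distance_km": 12.4, "eta_min": 18}

def charge_card(customer_id, amount):    # payment provider
    return {"status": "captured", "amount": amount}

def send_sms(phone, message):            # messaging provider
    return {"delivered": True}

def book_ride(customer, pickup, dropoff, rate_per_km=1.5):
    """Orchestrate one customer journey out of third-party services."""
    route = get_route(pickup, dropoff)
    fare = round(route["distance_km"] * rate_per_km, 2)
    payment = charge_card(customer["id"], fare)
    send_sms(customer["phone"], f"Driver en route, ETA {route['eta_min']} min")
    return {"fare": fare, "payment": payment["status"]}

ride = book_ride({"id": "c1", "phone": "+15550100"}, "A", "B")
```

The orchestrating company owns only the thin `book_ride` layer; every capability behind it is rented through an API, which is what makes the time-to-market so short.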

Afterwards these start-ups typically deliver their own APIs, which in turn are integrated into the offerings of other companies. For example, Uber's API is also integrated in the application of United Airlines.

These examples show the mutual benefits of such an open API ecosystem: the customer-facing company can deliver additional services to its customers, while the service-providing company profits from increased usage (and monetization) of its APIs and underlying products/services. For both companies this leads to increased revenues.

The example of Uber is certainly not an isolated case. UPS, for example, has successfully increased its market share by integrating its APIs into online webshops, and eBay already generates 60 percent of its revenues via its APIs (e.g. the API to submit an item for listing on eBay).

The insurance industry, traditionally quite slow to integrate new technologies, will also be more and more impacted. Open Insurance is becoming an emerging trend, pushed by increased and changing customer needs and InsurTech competition.

This article describes the impacts of this trend on the insurance industry.

2. Drivers

While Open Banking has been a hot topic for a while, the trend towards openness has also increased exponentially in the insurance industry (i.e. Open Insurance or API insurance). Just as with Open Banking, this is driven by multiple evolutions in the insurance industry:

Customer needs are increasing and changing: customers demand a multi- and cross-channel experience that is real-time and available 24/7. Furthermore, the experience should be customer-centric, rather than the product-oriented approach most insurers currently offer. For example, customers no longer want to buy a standardized insurance product; instead they want to describe the risk for which they want to be insured and receive a tailor-made offer from their insurance company. Typical examples of this trend are micro-insurances (small-amount insurances applying to one object and/or a short time period), peer-to-peer insurances, usage-based insurances (UBI) and the rise of super-apps (apps of large tech giants which act as a central platform to initiate any customer journey). Customers are also demanding stronger engagement with their insurer. While in the past customers only met their insurer when opening an insurance contract and when filing a claim, insurance companies now realize they should significantly increase the number of contact points with their customers. This will force insurers to evolve into service companies, offering different tools, typically linked to the prevention of the insured risk (i.e. preventing a claim).

Regulatory pressure: a continuous flood of new regulations means insurers' IT and operations departments are overloaded with making existing processes compliant. This leaves very little resource capacity and budget to work on innovative services, especially as most insurers also run several digitalization and operational excellence projects to reduce operational costs (vital as revenues drop due to increased competition and low interest rates). Insurance companies that want to innovate therefore need to form partnerships with other parties.

InsurTech competition: new, so-called InsurTech, entrants, which can deliver innovative customer services faster and cheaper, are disrupting the market (in 2018, 7.6 billion U.S. dollars was invested in the InsurTech sector). Following the example of the banks, insurance companies have learned it's far better to partner with a few well-chosen InsurTech companies to deliver attractive services to their customers than to compete head-on with them. This approach requires insurers to set up an open API architecture, which facilitates the rapid, plug-and-play integration of the insurers' and InsurTechs' service offerings.

Hungry for data: insurance companies are hungry for data. The more data an insurance company has about the customer and the object to be insured, the more accurately the insurer can fine-tune its actuarial risk models and consequently its insurance pricing (dynamic pricing model). Insurance companies should therefore integrate as much as possible with external providers to get the most accurate view of the insured risk. Especially with the rise of Big Data and AI, insurance companies now have the capacity to process and analyse this inflow of data and transform it into actionable results.

Rise of digital brokerage: until recently most insurance was sold via brick-and-mortar insurance brokers. Today more and more insurance is sold over the internet, via digital insurance broker platforms (i.e. online aggregators providing a comparison of different insurances) and via e-commerce platforms, which forces insurance companies to integrate in a very cost-efficient way with these different distribution platforms.

New technologies: the rapid technological evolutions in the industry (IoT, big data analysis, real-time customer analytics, AI, blockchain) make it almost impossible for an insurance company to invest (and stay at the top) in every new technology. Partnering with specialist companies, integrated through an open API architecture, is therefore almost a necessity to stay ahead of these technological evolutions.

New architectural design principles: historically, the application architecture in the insurance industry is composed of several large, very closed, monolithic legacy systems. This traditional architecture is reaching its limits in the current digital and fast-moving world. Insurers are therefore taking their first steps in migrating to a microservices-based architecture. Since microservices communicate with each other through well-defined APIs, such an architecture can be exposed to the outside world much more cheaply and quickly than a traditional one.
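The dynamic pricing mentioned under "Hungry for data" can be sketched as a base premium adjusted by externally sourced risk signals. Everything below is illustrative: the signal names, weights and figures are invented, and real actuarial models are far more sophisticated.

```python
def dynamic_premium(base_premium, risk_signals, weights):
    """Adjust a base premium with externally sourced risk signals.
    Each signal is assumed to be normalised to [0, 1]; the weights
    express how strongly each signal raises the price."""
    adjustment = sum(weights[k] * risk_signals[k] for k in weights)
    return round(base_premium * (1 + adjustment), 2)

# Invented telematics-style signals for one driver.
signals = {"hard_braking_rate": 0.8, "night_driving_share": 0.3, "annual_mileage": 0.5}
weights = {"hard_braking_rate": 0.30, "night_driving_share": 0.10, "annual_mileage": 0.20}

premium = dynamic_premium(500.0, signals, weights)
```

The more external data sources feed into `risk_signals`, the closer the price tracks the actual risk, which is exactly the incentive driving insurers toward open API integration.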

3. Impacts on the Insurance industry

The insurance landscape is undergoing its most fundamental transformation in decades, driven by the sector's fast digitalisation. Just as in the entertainment, media and retail industries, the internet has completely changed the way of doing business. Insurers should not only open their services, but also build their own digital ecosystems and participate in external ecosystems. Insurers therefore need to transform themselves into an "Open Insurer", offering "Insurance as a Service (IaaS)" (i.e. white-label insurance products).

Ultimately insurers should shift from building full end-to-end insurance solutions to assembling best-of-breed insurance services tailored to customer needs. This means that the traditional product-centric distribution should be transformed into services providing deep financial insights and integrating services from other sectors. This can only be achieved by creating an open API ecosystem, which is beneficial for all involved parties.

In practice, this API ecosystem digital platform would resemble an "app store" with services offered by the different parties in the ecosystem. The customer would be in the driving seat to choose the functionality/service and user interface that suits him best. Having made this choice, the customer would give consent to the party to use specific data present in the ecosystem. Since the customer can easily switch from one service to another, this will boost innovation considerably and result in new service offerings that are superior in terms of cost, performance and convenience.
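The consent step described above can be sketched as a small registry: the platform releases to a service provider only the data fields a customer has explicitly granted. The class, field names and values are all hypothetical.

```python
class ConsentRegistry:
    """Minimal sketch: the ecosystem platform filters customer data
    down to the fields the customer consented to share with a
    given service provider."""
    def __init__(self):
        self._grants = {}  # (customer, provider) -> set of allowed fields

    def grant(self, customer, provider, fields):
        self._grants.setdefault((customer, provider), set()).update(fields)

    def fetch(self, customer, provider, data):
        allowed = self._grants.get((customer, provider), set())
        return {k: v for k, v in data.items() if k in allowed}

registry = ConsentRegistry()
registry.grant("alice", "quote-service", {"car_model", "postcode"})

profile = {"car_model": "EV-1", "postcode": "1000", "income": 52000}
# The quote service receives only the consented fields; income stays private.
shared = registry.fetch("alice", "quote-service", profile)
```

Revoking or extending a grant changes what every subsequent `fetch` returns, which is what lets the customer switch services without re-exposing all of their data.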

Open Insurance will also have a substantial impact on the insurer's organisation, where barriers between business and IT departments will need to come down, as decisions will need to be made more quickly due to faster-evolving technologies and customer expectations. Thanks to Open Insurance, business and technology needs will become further aligned, requiring business analysis and software assembly and implementation to run side by side.

4. Opportunities and Threats for the Insurance industry

The creation of open API ecosystems offers several opportunities, but also significant threats, to the insurance industry.

Insurance companies not opening their architecture and not participating in these API ecosystems are expected to lose the most. It is interesting to quote the bank BBVA here: "A company without an API is like a computer without internet."

Even when insurance companies build out more engaging services, it is unlikely that customers will choose the app of an insurance company as a central access point to other services and products.

Insurance companies will therefore profit most from the API ecosystem by:

Utilising more data from a broader external ecosystem

Sharing their own data and algorithms with the rest of the world

Sharing their product stack with the rest of the world

In the next chapters, we will present a few examples of how these three approaches can benefit insurance companies.

4.1. Utilising more data from a broader external ecosystem

As mentioned above, insurance is a data-intensive business. Collecting large amounts of data and transforming them into actionable results is a core activity of an insurance company. Thanks to the digital revolution, insurance companies now have access to an almost unlimited supply of data, so choosing the right data sources and setting up, in a cost-effective way, the data pipelines to capture, transform and process this data will be a key challenge for each insurer.
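The capture/transform/process pipeline mentioned above can be sketched in a few lines. The event fields and the "risk score" normalisation are invented purely for illustration:

```python
# Minimal sketch of a capture -> transform -> process data pipeline.

def capture(raw_events):
    # keep only well-formed events arriving from an external source
    return [e for e in raw_events if "customer_id" in e and "value" in e]

def transform(events):
    # normalise a raw sensor reading into a risk score in [0, 1]
    return [{**e, "risk": min(e["value"] / 100.0, 1.0)} for e in events]

def process(events):
    # aggregate to an actionable per-customer result (average risk)
    per_customer = {}
    for e in events:
        per_customer.setdefault(e["customer_id"], []).append(e["risk"])
    return {cid: round(sum(r) / len(r), 2) for cid, r in per_customer.items()}

raw = [{"customer_id": "A", "value": 80},
       {"value": 10},                       # malformed: dropped at capture
       {"customer_id": "A", "value": 40}]
print(process(transform(capture(raw))))  # {'A': 0.6}
```

Each stage is independent, which is the property a real insurer would want when swapping one of the "almost unlimited" external data sources in or out.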

Some examples of data sources which could result in valuable insights for insurers:

Open Banking data: thanks to the EU PSD2 initiative and the Open Banking directive in the UK, customers' account data can now be accessed by TPPs (Third-Party Providers). This allows insurance companies to get information about the financial situation of their customers, but also to gain valuable insights into the types of income and expense transactions the customer executes.

IoT data: the rise of the "Internet of Things" (IoT) can revolutionize the insurance industry, as it facilitates usage-based insurance (UBI), dynamic risk modelling and dynamic insurance pricing. Typical examples of insurance products linked to IoT are:

Car insurance: via continuous monitoring of the driver's behaviour and driving habits, car insurance policies can be dynamically adapted. This monitoring can be done via an onboard diagnostics (OBD) device installed in the driver's vehicle or via the driver's smartphone. Based on this data collection, several services and enhancements can be offered:

Flexible pricing: reward safer driving through lower premiums.

Improved fraud detection: identify where the car is parked (during the day and at night), whether a personal vehicle is used for professional services (like driving a delivery route), how many kilometres are actually driven with the car, and whether claim fraud is committed, by comparing the car data with the data entered in the claim report (e.g. check whether the car was actually present and braked hard at the reported crash location).

Inform customers of risky situations for the car, e.g. notify the customer about bad weather expected in the neighbourhood of the car (e.g. hail), or notify the customer when he parks the car in a neighbourhood with a high number of reported thefts.

Provide the customer with a game-like overview of his driving statistics, like speed and kilometres driven, with support tools such as simulating future fuel costs.

Support the customer in case of car breakdown, accident or theft. For example, the insurance company can pro-actively contact the client and arrange emergency support (in case of injuries) when it detects an accident, automatically pre-fill a digital claim (based on the collected data) in case of an accident, help arrange car assistance (e.g. Touring assistance) when a car breakdown is identified, or identify theft of a car when unexpected (deviating) driving behaviour is recognized.

Allow parents to monitor (track) their teenagers when they use the family car.

Support short-term car insurance, allowing policyholders to insure themselves on a friend's vehicle for a short period, or allowing drivers to buy short-period coverage on their own vehicle.

Home insurance: Improve the protection of insured houses against threats (fire, leak, flood and theft), thus reducing the risk for insurance claims.

Life insurance: wearable sensors (e.g. Fitbit) can be used to monitor health activities and communicate the results back to the company for lower life insurance premiums. Different biometric readings can be collected, like heart rate, body temperature, blood pressure, movement, calorie burn-rate and alcohol consumption, which can be used to gamify healthy habits into a point system. Furthermore, insurance companies can provide services for the elderly (assisted living) for safety and care (e.g. check whether the customer is properly taking his required medication).
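The usage-based car pricing described above under "Car insurance" can be illustrated with a toy model. The thresholds, weights and discount range below are made up for the sketch and carry no actuarial meaning:

```python
# Toy usage-based insurance (UBI) pricing from simple telematics features.

def driving_score(speeding_events, harsh_brakes, km_driven):
    """Return a score in [0, 1]; higher means safer driving."""
    incidents_per_100km = 100.0 * (speeding_events + harsh_brakes) / max(km_driven, 1.0)
    return max(0.0, 1.0 - 0.1 * incidents_per_100km)

def monthly_premium(base, score):
    # a perfect score earns a 30% discount; a zero score pays a 30% surcharge
    return round(base * (1.3 - 0.6 * score), 2)

safe = driving_score(speeding_events=1, harsh_brakes=0, km_driven=1000)
risky = driving_score(speeding_events=20, harsh_brakes=10, km_driven=500)
assert monthly_premium(50.0, safe) < monthly_premium(50.0, risky)
```

In a real UBI product the score would come from a trained risk model rather than a hand-written formula, but the flow (telematics features in, premium adjustment out) is the same.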

Location data from mobile phone: sending location data from the mobile phone to the insurer not only allows insurers to get a better idea of the risks a specific customer is taking, but also allows them to provide extra services and cross-selling opportunities to the customer. Some examples:

When the customer is driving to the airport or is located abroad but does not yet have a travel insurance policy, the insurer could propose that he open one.

When the customer is driving to the hospital, the insurance company could inform the customer about the terms of his hospital insurance.

Based on the combination of climate data and customer geolocation data, the insurer can offer hurricane alerts.

When the customer is abroad and it can be derived from other data that he is abroad on holiday, this information could be used to send the customer a "Happy holiday" card and to make sure the customer is not contacted at that moment.

Social media: insurers can obtain valuable data about their customers from social media such as Facebook, Twitter and LinkedIn.

Customer referential data: when insurers are informed about changes in customer data well in advance (by integrating with postal services, social media or governmental services), this provides many opportunities to insurance companies:

A change in address can be an interesting trigger to contact a customer to sell new policies or revise existing ones. Opportunities exist not only for home insurance, but also for other policies like car insurance (a change in address can result in different transportation habits, e.g. less access to public transport). It can even be a sign of a relationship break-up or a child leaving the parental house, in which case a full revision of the insurance portfolio is required.

The birth of a child is also a moment for revision, typically for family liability or hospital insurance. The same applies to a change in civil status.

Valuation services: in order to properly assess insurance risk, a correct valuation of the insured object is also required. Most insurance companies have good models for this, but external services can be used both to provide an accurate initial valuation and to regularly review the current valuation. Some examples:

Car insurance: call external services to determine the value of a car when setting the insurance premium, but also to value the damage in case of a claim. Examples of such services are cars.com, vinaudit.com, Edmunds and Informex.

Real estate: services that valuate a real-estate property, e.g. Rets.ly, SimplyRETS, Rets Rabbit, Property API, Zillow.

Art: valuation of art objects, e.g. artnet.com, artprice.com, valuemystuff.com.
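To make the location-based servicing described in this section concrete, here is a minimal sketch. The points of interest, action names and policy types are all hypothetical:

```python
from typing import Optional

# Points of interest and the contextual actions they trigger (illustrative).
POINTS_OF_INTEREST = {
    "airport": "offer_travel_insurance",
    "hospital": "explain_hospital_cover",
}

def contextual_action(location: str, active_policies: set) -> Optional[str]:
    """Decide which contextual message (if any) to send the customer."""
    action = POINTS_OF_INTEREST.get(location)
    if action == "offer_travel_insurance" and "travel" in active_policies:
        return None  # already covered: don't spam the customer
    return action

assert contextual_action("airport", {"car"}) == "offer_travel_insurance"
assert contextual_action("airport", {"travel"}) is None
assert contextual_action("hospital", {"car"}) == "explain_hospital_cover"
```

The second assertion captures the important design detail: cross-selling triggers must be suppressed when the customer already holds the relevant cover, otherwise the "extra service" becomes a nuisance.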

4.2. Sharing their own data and algorithms with the rest of the world

Insurance companies that collect (some of) the above data from external sources, combine it with their rich internal data sets (customer referential data, policy details, claims data) and process it in an efficient way, hold valuable insights which can also be commercialized to other parties. This chapter provides a few examples of how insurance companies can share their data and algorithms with other parties:

Insurers have worked out sophisticated models for fraud detection, customer risk assessment and valuation of insured objects, which can be exposed (and monetized) to third parties.

Competitors taking over a car insurance policy will be very interested in obtaining the claim history of the customer.

In order to properly insure a car, insurance companies could request customers to provide the name of the driver for any drive (especially for company leasing cars, rental cars and car-sharing services). This information could be useful for the police and parking companies, to send fines and bills directly to the right person.

Instead of the traditional Green Card, insurance companies can provide a digital Proof of Insurance via an API. This would allow the police, ANPR cameras and technical inspection companies to directly check all details of a car insurance policy and its holder.

Aggregated views of customers' assets, liabilities and off-balance positions, like PFM tools, financial or pension planning tools and account aggregators, aim to provide a holistic view of the customer's wealth. Those tools are very interested in obtaining (after the customer's consent) a view on the customer's policies and the outstanding balance of life insurances.

Insurance planning and aggregation tools, which allow customers to assess risks and automatically open insurance policies to protect against them (e.g. UnderCover from Ensur).

When insurance companies collect driving data about customers, this data could also be shared with other interested parties, such as road-pricing schemes (road tolls, distance- or time-based fees, congestion charges), car-sharing and rental services, and fleet managers.
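The digital Proof of Insurance idea listed above could, purely as a sketch, look like the following. The payload shape is an assumption (there is no such standard in the source), and the SHA-256 digest stands in for what would really be an asymmetric signature:

```python
import hashlib
import json

# In reality this lookup would hit the insurer's policy administration system.
POLICIES = {
    "1-ABC-123": {"insurer": "ExampleIns", "valid_until": "2025-12-31", "holder": "J. Doe"},
}

def proof_of_insurance(plate: str) -> dict:
    """Return a verifiable insurance status for a licence plate."""
    policy = POLICIES.get(plate)
    if policy is None:
        return {"plate": plate, "insured": False}
    payload = {"plate": plate, "insured": True, **policy}
    # attach an integrity digest; a real deployment would use a proper
    # cryptographic signature so checkers can verify the insurer issued it
    digest = hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()
    return {**payload, "signature": digest}

assert proof_of_insurance("1-ABC-123")["insured"]
assert not proof_of_insurance("9-ZZZ-999")["insured"]
```

An ANPR camera or inspection service would call such an endpoint with a plate number and trust the signed response instead of a paper Green Card.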

4.3. Sharing their product stack with the rest of the world

Of course, the most interesting of the three categories is where an insurance company opens its product stack to the outside world, as this can directly generate revenues for the insurance company. We identify two categories here: integrations for the origination of new insurance policies and integrations for the servicing of existing insurance policies. Some examples:

Origination of new insurance policies: by providing APIs to other industry apps, an insurance company can obtain new customers who were not even thinking about the related insurance aspects:

Car dealer apps: sell car insurances

Luxury goods dealer apps: sell policies to insure a specific object or prompt a review of the home insurance

Real estate apps: sell home insurances

IT protection products like firewalls and virus scanners: sell cyber-security insurance

Travel apps: sell vacation/travel insurances

e-Commerce apps: sell insurance for delayed or failed delivery (insuring either the company selling the product or the customer receiving it)

Apps that sell (perceived) risky activities, like extreme sports, parachute jumping, flying or travel to dangerous areas: sell short-period life insurances (e.g. Sure provides micro-duration life insurance coverage during air travel)

Insurance comparison tools, e.g. BusinessComparison, Moneysupermarket, Confused, Compare the Market, MoneySavingExpert

Servicing existing insurance policies: provide different APIs to act upon existing insurance policies:

Apps for "safe driving courses", "stop smoking therapy" or "sport/fitness clubs" could provide a service to directly review insurance premium pricing (of car, hospital or life insurance)

Automatic filing of a claim, e.g. when a trip is cancelled, when a package ordered on the internet cannot be delivered, or when a plane is delayed

Block (and unblock) an investment insurance policy as collateral for a credit sold by a bank (cf. the LABL product of Capilever). This could allow a customer to open a cheaper consumer credit, as it is backed by the money of an insurance policy. This could also be used for mortgage credits, backed by a group insurance acting as debt balance insurance.

Pay out a life insurance: a bank or notary receiving notice of a death could provide an automatic instruction to pay out the life insurance.

Adapt details of an insurance policy, e.g. fleet managers or car rental/sharing services automatically adapting the driver linked to an insurance policy, or social secretariats automatically adding extra employees to group insurances or adding/removing an employee's partner or children to/from a hospital insurance.
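The automatic claim filing mentioned in the servicing list above can be sketched as a simple event handler: a partner app (say, a flight-delay feed) pushes an event, and a pre-filled claim is created if the customer holds matching cover. Event and claim fields are illustrative assumptions:

```python
# Claims filed so far (in reality, the insurer's claims system).
claims = []

def on_event(event, policies):
    """File a pre-filled claim automatically if the event matches a covered policy."""
    policy_id = policies.get((event["customer_id"], event["type"]))
    if policy_id is None:
        return False  # no matching cover: nothing to file
    claims.append({
        "policy_id": policy_id,
        "reason": event["type"],
        "details": event.get("details", {}),
        "status": "pre-filled",  # the customer only has to confirm
    })
    return True

policies = {("cust-1", "flight_delay"): "POL-42"}
assert on_event({"customer_id": "cust-1", "type": "flight_delay",
                 "details": {"flight": "XX123", "delay_min": 95}}, policies)
```

Leaving the claim in a "pre-filled" state, rather than settling it outright, keeps the customer in control while removing almost all of the paperwork.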

5. Success Stories

Even though the insurance sector is only at the beginning of its transformation to an Open Insurance API ecosystem, several examples of successful API ecosystems can already be identified today:

In 2008, PolicyBazaar was founded in India, as an online platform that aggregates insurance plans and serves as a marketplace for policies.

In 2013, Ping An, Tencent and Alibaba joined forces to launch Zhong An, an online-only property insurance company selling high volumes of low-cost micro-insurances (e.g. cracked-screen insurance, flight-delay insurance, shipping-return insurance) via China's biggest tech companies. With over 630 million insurance policies and 150 million clients serviced in its first year of operation, this is definitely a success story.

In 2014, Ping An (the largest insurance company in the world) created Ping An Good Doctor, a healthcare ecosystem for the Chinese market. With 265 million registered users, it is the largest mobile medical application in China.

In 2016, AXA partnered with Silicon Valley InsurTech Trv to attract British millennials. Trv allows customers to buy flexible, short-term insurance for single items via their smartphones. For example, a customer can open a temporary insurance policy for an expensive camera when he wants to use it.

In 2016, American InsurTech company Lemonade launched its online-only insurance app for renters and home insurance policies. Currently valued at more than $2bn, the company disrupted the American insurance industry, via fully digital and cheap insurance policies.

Read more:

Transforming the insurance sector to an Open API Ecosystem - Finextra

Why the $52K Mac Pro is important for everyone who cares about the survival of the Mac ecosystem – ZDNet

Let me be clear here. I don't ever see myself buying the new Mac Pro. That's not only because I couldn't justify buying a computer that would cost more than putting an extension on the house; it's because I don't actually need it. Even so, I'm very, very comforted by the knowledge that it exists.

In fact, I'm quite content with my new Mac mini and my well-equipped repurposed 2013 iMac which now lives in my family room as a development machine. That's because I need 32GB of RAM to run my development system, video editing environment, and gaggle of virtual machines. I don't need 1.5TB.

And yet, the existence of a $6,000-$52,000 Mac Pro adds to my sense of existential security -- and if you're a Mac user, it should add to yours as well. Here's why.

Everyone reading ZDNet clearly understands the concept of a computing platform. It's a hardware/software environment that we build solutions on top of. iOS and Android are platforms, and they support (more with Android, of course) a variety of mobile solutions.

Linux (and, to some extent, BSD), Windows, and MacOS are platforms, and they support general purpose computing, ranging from running little point-of-purpose Raspberry Pi servers up to all our desktops and laptops, up through the giant server farms at Google, Amazon, and Facebook.

Now, the fact is (with a few limited exceptions), no one is going to run a Facebook-scale server farm on Macs, not even Apple. But desktop and workstation computing? That's definitely the domain of Macs.

As we all know, the desktop computing market has changed considerably in the past decade. Many consumers who need mostly to communicate and consume data have moved off traditional desktops and laptops to smartphones and tablets.

But even as the needs of many consumers have been met by simpler-to-use mobile-centric devices, the needs of workers and professionals have continued to grow.

As recently as 2013, I said I didn't need the then-Mac Pro, because I didn't see myself needing to do video editing or 3D modeling, two tasks that require a lot of computing resources. Fast forward six years, and a huge portion of my workload involves video editing and 3D modeling.

When buying a computer for business use, perhaps the single most important factor is understanding the intended workload. If you're traveling all the time and you want to be able to write and respond to email, a tablet or a small MacBook Air-sized machine is fine. If you're doing animation for a big-budget blockbuster movie, a MacBook Air or a Microsoft Surface would simply melt under the load.

In business, we choose the computer we're using based on our expected workload over the next 2-3 years. We choose the computer platform based on our expected workload over the next 5-10 years.

This is a critically important distinction. When we choose a platform, whether that's Windows, Mac, or Linux, it's because we're planning on investing in software and skills that we expect will stand the test of time. It's fine to upgrade a machine, but if you have to migrate a software platform, that's a lot more work, if it's even possible.

Changing from one machine to another on the same platform is a day or two of work. Migrating from one platform to another is a staged battle that could take a year or longer.

It's the platform migration problem that makes the Mac Pro, especially at its ear-bleedingest high end, so important to Mac users. Put simply, the Mac Pro future-proofs the platform when it comes to workload demand.

It's all about headroom. When we choose a platform, we're not just thinking about the machine we're using now, but whether that platform can service us throughout our scope of work and beyond. Building a piece of software can take a few years. Making a big-budget movie can take three to four years. Designing a new car can take a decade.

When we choose a platform, we want to make sure it will work for us throughout all that time. That means it's important to know that as our needs grow, our platform can meet those needs.

Before I go on, it's important to say that many of us use two or more platforms. I regularly switch between Mac, Windows, and Linux. I use Mac for most of my daily work, particularly my heavy workloads. I use Linux on all my servers. And I use Windows for some business applications that don't have Mac implementations. And let's not forget the cloud. The cloud is its own platform and turns desktop platforms into engines that run browsers.

So, it's not just about choosing between Mac, Windows and Linux. It's about whether the platform you're using for a specific workload can scale with that workload. For me, the Mac needs to scale for my development environment and video production needs, while Linux needs to scale with the load on my servers. Windows just needs to keep running all that Windows-only software.

From about 2014 to about 2018, it was completely unclear whether or not Apple cared about providing its Mac customers with that headroom. Machines generally went un-updated. It took six years for a new Mac Pro, for example. It was unclear that pro users would have enough power in the Mac platform to take us where we needed to go.

This was an existential question. If the platform wouldn't grow with our professional needs, then the platform would have to go. The gotcha, at least for me, is that there are applications on the Mac platform that don't exist elsewhere. While I can do the same work on Windows and Linux as on Mac, I can't do it as quickly. In fact, I save two-to-three days a week using Mac apps. That's measurable.

But if Apple was abandoning Macs -- and it sure looked that way in 2017 -- then I and many other Mac-using professionals would have had to begin looking at a long migration process.

That all changed in 2018. The company finally introduced a laptop that had more than 16GB RAM. The iMac Pro was in active use by many pros. The Mac mini got a very long-overdue overhaul. And Apple announced the new Mac Pro.

This was big, not because we all just wanted to spend money, but because it meant that we had more runway with our workloads. Those of us relying on the Mac platform did not have to begin developing a migration strategy.

Key to this was the high-end Mac Pro, which tops out at a whopping $52,000. It's not, as I mentioned at the beginning of this article, that I have a need for it. Most business users won't. But you don't choose a platform based on what you need now.

Some folks, today, need a machine with 1.5TB of RAM. Others need to know that such a machine is available, even if we never expect to use one. The creativity, software support, market engagement, and robust project environments that will result from the high end machines and their users are of big benefit to all Mac users. The Mac Pro promises that those who will need to go there in the future actually can.

Oh, and for those who think that $52K is at the top of the dollar spectrum for PCs, you'd be wrong. For kicks, I just spec'd out a $162K (and that's after $69K in discounts) Dell 7920 Tower Workstation. While you and I might never spend that on a single PC, some folks need all that capability for their workloads.

That said, the four hundred dollar upcharge to add wheels to the Mac Pro is just rubbing it in.

You can follow my day-to-day project updates on social media. Be sure to follow me on Twitter at @DavidGewirtz, on Facebook at Facebook.com/DavidGewirtz, on Instagram at Instagram.com/DavidGewirtz, and on YouTube at YouTube.com/DavidGewirtzTV.


Apple Issues New Blow To Google With This Bold Security Move – Forbes

Apple is cementing its place as the company of choice for those who care about security and privacy with a bold move that hits out at its biggest smartphone rival Google.

Apple has already been very vocal about the security and privacy built into its iOS 13 operating system update, which hits out at firms such as Google and Facebook by limiting the data they can collect. After making a bold privacy move a month ago, Apple is now doubling down on security by launching a new Platform Security Guide detailing how its iPhones, iPads and Macs are more secure than Google's Android devices, because the firm owns the whole ecosystem.

Apple's devices have always been regarded as more secure, because Apple owns the hardware, software and apps. In contrast, although its biggest smartphone rival Google does make some of its own Android phones and has a level of control over its app store, the often separated hardware, software and platforms can make things very fragmented and pose security risks.

Apple's security guide for Fall 2019 doubles down on how Apple keeps your devices and data secure across iOS and MacOS. It covers hardware security and biometrics such as Face ID and Touch ID (which is thought to be returning with the iPhone 12 next year), among other areas.

The Platform Security Guide reads: "Every Apple device combines hardware, software, and services designed to work together for maximum security and a transparent user experience in service of the ultimate goal of keeping personal information safe.

"Custom security hardware powers critical security features. Software protections work to keep the operating system and third-party apps safe. Services provide a mechanism for secure and timely software updates, power a safer app ecosystem, secure communications and payments, and provide a safer experience on the Internet.

"Apple devices protect not only the device and its data, but the entire ecosystem, including everything users do locally, on networks, and with key Internet services."

As part of the guide, Apple emphasises its commitment to security, which could be seen as a direct swipe at Google and Facebook as companies that have seen their own share of data and security scandals. Apple points to its bug bounty program, which is now open to all ethical hackers, and its dedicated security team as reasons it is more secure.

But at the same time, it's important to note that Apple isn't perfect: it came under fire from lawmakers recently after it emerged that the firm wasn't applying the same controls to its own apps that it applies to others. With this in mind, I created a useful guide to securing your apps in iOS 13, including Apple's.

Another cool new feature in iOS 13.3 is the ability to use security keys with your iPhone in Apple's Safari browser. I wrote an article, including more information and a video demo, on how to use it.


The Top 3 Fintech Stocks to Buy in 2020 – Motley Fool

Financial technology, or fintech, has experienced fantastic growth over the past decade, transforming the way many financial tasks have traditionally been accomplished. This ranges from banking apps and robo-advisors to commission-free trading and mobile payments. Searching for investments in rapidly growing sectors, such as fintech, can be a great way to produce market-beating returns.

For investors looking to juice their investment returns in 2020, here are three fintech companies that just might get the job done: Broadridge Financial Solutions (NYSE:BR), Global Payments (NYSE:GPN), and Square (NYSE:SQ). Let's take a closer look at each to see why I believe these stocks will provide great returns for investors in 2020 -- and beyond.

Fintech is changing the financial industry from the inside out. Image source: Getty Images.

Broadridge Financial delivers mission-critical services to asset managers, banks, brokers, and other financial industry players. Among the many necessary services the company provides are "investor communications, securities processing, data and analytics, and customer communications solutions." The company operates two business segments, Investor Communication Services (ICS) and Global Technology Operations (GTO).

ICS makes up the bulk of Broadridge's revenue, earning $3.5 billion in revenue in 2019, relatively flat compared to 2018's results, with $1.8 billion of this representing recurring revenue streams. This segment comprises the many platforms Broadridge provides to its clients, including ProxyEdge, an electronic proxy delivery and voting solution, and Matrix Financial Solutions, a mutual fund trade processing service. ICS also provides services to corporations that allow them to communicate with shareholders about annual meetings, financial reporting document management, and SEC disclosure and filing services.

GTO reported $954 million in recurring revenue in 2019, an approximate 5% increase year over year. As Broadridge's 10-K, its annual filing with the SEC, states, the GTO segment offers "advanced solutions that automate the securities transaction lifecycle, from desktop productivity tools, data aggregation, performance reporting, and portfolio management to order capture and execution, trade confirmation, margin, cash management, clearance and settlement, asset servicing, reference data management, reconciliations, securities financing and collateral optimization, compliance and regulatory reporting, and accounting."

Whew! These services allow asset managers to cost-effectively manage their books and records, allowing them to focus on their core competencies.

None of these services is sexy, but they are absolutely essential, meaning companies aren't going to stop paying for them during an economic downturn. Asset managers and financial institutions also face mountains of regulatory red tape over these types of communication and equity processing services, meaning they will think long and hard before switching from a proven vendor, such as Broadridge, to a competitor.

Broadridge is not for investors looking to get rich quick. Its revenue growth is lumpy and semidependent on the tuck-in acquisitions that management is prone to make. But the stock has outperformed the market over the trailing 10-, 5-, and 3-year periods, and I don't see any reason that outperformance is going to stop any time soon.

Global Payments' biggest event of the year was its acquisition of Total System Services for a cool $22.15 billion, the second-largest fintech acquisition ever. The deal combined one of the largest merchant processors, Global Payments, with one of the largest payment processors for card-issuing financial institutions, Total System. The combined company has already upped expectations for cost savings and revenue synergies the merger will recognize. It now expects to save $325 million annually and realize $125 million in revenue synergies within three years due to the combination. The acquisition creates a payments powerhouse that will process payments at more than 3.5 million merchant locations, facilitating 50 billion transactions per year, and recognize about $3.5 billion in adjusted EBITDA annually.

While payment processing is largely a commodity, Global Payments has managed to transform about 50% of its payments services into what it calls "technology-enabled solutions." Global Payments now owns eight industry-specific software verticals, and it's within these verticals that its payment processing services are embedded. These vertical software stacks make it much harder for customers to switch payment processing vendors in the future, giving them a lot more to consider than just price.

For instance, in August 2018, Global Payments acquired AdvancedMD, a company that provides small- and medium-sized physician offices with cloud-based scheduling, billing, health record management, and payments solutions, for $700 million. Doctor offices that use AdvancedMD's software would not want to swap out their entire back-office software just to switch to another payments vendor -- this comes with a high switching cost, something not usually associated with a commoditized service.

Square is another example of how a company innovated beyond basic payment processing to the point that it was offering much more than a commoditized service. When the company was founded, it offered a cool way for smaller merchants to accept card payments with a simple dongle that plugs into a mobile device, such as a smartphone or tablet. While this was innovative at the time, it was soon copied by a host of other payment processing services. Fortunately for Square investors, the company has since built out an entire ecosystem around its payment processing services.

This ecosystem includes subscription services such as Square Payroll, Square Capital lending and the Cash App.

Once merchants subscribe to one or more of these solutions, they will think awfully hard before leaving for another payment processing provider. No business wants to disrupt its payroll service just to save a few bucks on facilitating payments.

Not only do these services make Square's ecosystem sticky, but they are also incredibly lucrative and growing at a rapid clip. Subscription- and services-based revenue, where these services are accounted for, grew to $280 million in 2019's third quarter, a 68% increase year over year, and gross profit from this segment rose 82% to $216 million. As long as Square continues to innovate around building out a comprehensive ecosystem for small businesses, it should continue to experience explosive growth, fueling market-beating returns for investors.


GW addresses report recommendations to improve research – GW Hatchet

Media Credit: Alexander Welling | Assistant Photo Editor

Miller said his office aims to increase corporate partnerships with University researchers to increase the revenue reaped from research.

The University has been working to fulfill recommendations laid out in the results of the first phase of a faculty-led research review released in April.

Officials said at a Faculty Senate meeting Friday that the University has upheld its commitment to provide institutional funding for research initiatives and has taken steps to boost undergraduate engagement in research since the ecosystem review launched last fall. Vice President for Research Robert Miller said he will continue to hire faculty with strong research backgrounds and initiate corporate partnerships that will provide researchers with greater revenue-building opportunities and exposure for their projects.

"The strategic planning is two things," Miller said. "It's one, thinking about what are our disciplines that we're actually going to focus on and strive for preeminence, and then the other one is, 'How are you actually going to implement that?'"

The ecosystem review is one aspect of the University's multi-step strategic plan to bolster its research profile. In April, officials released Phase I of the review, which recommended that the University hire more postdoctoral fellows and improve communication between faculty, staff and students.

Miller said officials have extended funding for programs like humanities research that are not privy to as many external funding opportunities or grants as other programs.

"One of the things that the president promised when he came in was that we would maintain support from institutional money to drive scholarship and research in those areas that are not easily attributable for external funding," he said.

Miller said the educational working group that conducted the review created the GW Student Research Commons this academic year, which allows undergraduates, graduate students and postdoctoral fellows to learn about research opportunities.

He said he plans to spearhead partnerships with corporations to build revenue for the school and expand the breadth of research faculty can conduct. Miller said corporate partnerships give researchers opportunities to share their discoveries with wider audiences than if they were conducting individual research.

"It's a way of getting our ideas out into society, and I think that's the most important," he said in an interview.

Miller said entities that fund research efforts review research processes to ensure their money is spent the way they intended, and GW received a perfect score on the most recent review. He said the University should prioritize research integrity, or ensuring that research is completed in the way the research's financier intended, moving forward from Phase I of the review.

Miller said reports from Phase II of the review are due to University President Thomas LeBlanc in February, at which point the review team will begin strategizing how to respond to the reports' recommendations.

"We cannot afford to have lapses in any of these areas; you cannot afford to have federal dollars that are not going to the work that they are proposed to do," Miller said.

Arthur Wilson, an associate professor of finance, asked Miller how he plans to bolster research in the School of Business, which was not included in the report.

"I'm from the School of Business, so I guess I'm a little bit biased," Wilson said. "The School of Business was almost non-existent on your presentation."

Miller said he plans to work with leaders in the School of Business to determine what research topics experts within the school are well-versed in and engage in research activities in those areas.

"It's a question of identifying skills and opportunities," Miller said. "We need to work with you, with the groups, to bring together our expertise in identifying opportunities with your expertise and skill sets where they actually can be utilized."

Holly Dugan, an associate professor of English, asked Miller how he plans to address the 30 unresolved issues in non-sponsored research that his presentation noted.

"The report is very clear on what could address these issues, including non-release time based on actual marginal costs of teaching, replacement centers that allow non-sponsored research participants to connect with one another, rather than cross-disciplinary," Dugan said.

Miller said he plans to form strong lines of communication between research project leaders and the deans of the schools in which the research is conducted to resolve issues.

"It's more than that, because it is the interface between academic activity and the administrative activity of the ground support systems that need to come together," he said.

Continue reading here:

GW addresses report recommendations to improve research - GW Hatchet

Hyundai teases flying car concept ahead of CES – Autoblog

Hyundai will introduce a flying car concept at this year's Consumer Electronics Show as part of a larger vision for a future personal mobility ecosystem, the company announced Friday.

The car (yes, we use that term loosely) will appear alongside two more conceptual designs. The first, which Hyundai refers to as a "purpose built vehicle" concept, is little more than a flying cargo van; the second is the hub from which these hypothetical vehicles will operate.

The hub is key to Hyundai's vision. Placed throughout urban areas, these hubs will serve as both access points for this quasi-decentralized transportation network and public gathering spaces for their surrounding communities.

"The use of airspace is expected to alleviate road congestion and give back quality time to city commuters," Hyundai's announcement said. It was otherwise light on details. We expect Hyundai will have more to say when it formally presents this vision to the public.

Automakers have expanded their presence at CES in recent years as they make more and more inroads into the mobility space, which encompasses everything from self-driving cars and e-bikes to broader concepts, such as urban planning and redevelopment.

Hyundai's vision for a connected future aligns with many other automakers', not to mention those of several non-automotive entities looking to get into the personal mobility game. Porsche announced in October that it has partnered with Boeing to develop what amounts to a flying car, and the latter's aerospace expertise makes their joint venture, dubbed Aurora Flight Sciences, one of the more promising in the industry.

Promising or not, aerial programs are still a long way off. Even by Aurora's (perhaps optimistic) estimation, demand for eVTOL vehicles won't start to pick up for at least five years. Regulatory realities could easily push that back years, if not decades.

Hyundai's concepts will be on display at the Mandalay Bay South Convention Center on Jan. 6.


See the rest here:

Hyundai teases flying car concept ahead of CES - Autoblog

N.S. won’t protect land with ‘globally rare’ ecosystem that company eyes for golf resort – CBC.ca

In the tiny community of Little Harbour on Nova Scotia's Eastern Shore sits 285 hectares of coastal Crown land known as Owls Head provincial park. The name is misleading: it's not actually a provincial park and there are no obvious markings or trails to enter the coastal barrens and wetlands.

But the headland, which has been managed as a park reserve, is notable for some of its characteristics.

According to the province, it's one of nine sites in Nova Scotia with a "globally rare" ecosystem and home to several endangered species. For six years, Owls Head has been one of the provincial properties awaiting legal protection.

But that changed last March when, after several years of lobbying by and discussions with a private developer who wants to acquire the land as part of a plan to build as many as three golf courses, the Treasury Board quietly removed the designation, according to records CBC News received in response to an access-to-information request.

This sets up the latest situation in Nova Scotia where conservation and environmental protection efforts appear poised to collide with economic development interests, as the developer hopes to bring the kind of tourist attraction and job opportunities to the Eastern Shore that Inverness is realizing from the Cabot Links and Cabot Cliffs golf courses.

The decision to de-list Owls Head was made using a minute letter, which is protected by cabinet confidentiality and thus not available for the public to see. Government documents, however, make clear a plan which, until now, has been unknown to the public.

Lighthouse Links Development Company, which is owned by American couple Beckwith Gilbert and his wife, Kitty, is behind the proposal. They already own 138 hectares of land next to the Owls Head property.

Gilbert has a background in merchant banking and has been heavily involved in medical research.

He was not available for an interview, but in an emailed statement he said the couple fulfilled "a dream to own and preserve an unspoiled, natural ocean beach" when they started buying land in Little Harbour 16 years ago.

As he and his wife got to know the community, Gilbert said "it became quickly apparent that additional employment opportunities in the area were needed to encourage people to move to the Eastern Shore, rather than move away."

The idea for one golf course blossomed into two or three after talking to architects, he said.

"They emphasized that multiple adjacent courses were necessary to achieve profitable operations. Since we didn't have enough land for more than one course, we approached the province and proposed acquiring their unused adjacent land."

Gilbert's vision, according to a letter sent on his behalf to then-natural resources minister Lloyd Hines's executive assistant in 2016, which CBC obtained, is for something similar to the Cabot resort or Bandon Dunes golf resort in Oregon.

Chris Miller, a conservation biologist and executive director of the Nova Scotia branch of the Canadian Parks and Wilderness Society, called the proposal "deeply concerning."

Miller said the land is important because of how much there is, allowing for more extensive ecosystems than what is typically found along the coast and because so little of the province's coast is publicly owned and protected.

"It's a place where conservation values and nature need to come first and human and economic development is only within the context of protecting those values," he said.

Like Miller, Nova Scotia Nature Trust executive director Bonnie Sutherland said she had no idea about the change in designation, which still has not been updated on the provincial website.

Sutherland's organization was intending to include Owls Head as part of its 100 Wild Islands project, which aims to protect the archipelago off the Eastern Shore. About 85 per cent of that work is complete.

"We're very surprised and disappointed," she said. "To lose that habitat is very significant."

Sutherland pointed to the endangered species known to live and, in some cases, nest on the land, including piping plovers and barn swallows. Other species of "conservation concern" known to be there, according to government documents, include the ruby-crowned kinglet and common eider.

There are "unique boreal and temperate plants and lichens" and Owls Head is one of nine sites in the province with "the globally rare coastal broom crowberry heathland ecosystem," said the documents.

But Lands and Forestry Minister Iain Rankin said the government was comfortable removing the designation because the land isn't a priority for legal protection.

By removing the designation, the government can now have the land appraised and begin more formal negotiations for a potential sale, said Rankin. The final decision was made by weighing the option to protect with the potential for economic development in a rural area, he said.

"There isn't high biodiversity value when you compare [it] to other pieces of land that we've advanced [for legal protection]," said Rankin. He said the decision would not affect the government's ability to reach its goal of legal protection of 13 per cent of all Nova Scotia land.

The golf project has had the support of Central Nova MP Sean Fraser and Eastern Shore MLA Kevin Murphy. In August 2018, the company hired former provincial Liberal cabinet minister Michel Samson to lobby on its behalf.

Owls Head isn't the only Crown land Lighthouse Links wants to acquire.

The province is in negotiations with the federal government on behalf of the company to buy about 17 hectares of surplus Crown land next to Owls Head that's home to an automated light beacon and helipad. Ottawa would keep 0.09 hectares, including the helipad and light, and sell the rest to the province for $167,500.

Originally the land was offered for $1, but that required the province to use it for a public purpose. Had the province not engaged Ottawa on the offer, the land would have gone to public sale.

An order in council approving that negotiation was passed in August (unlike minute letters, orders in council are posted online). Rankin said he sees the company's proposal as a good opportunity and said any kind of development would still have to respect the applicable environmental regulations.

"I see golf courses coexisting with opportunities for protecting the environment," he said.

Miller, who said such a development would irreparably alter Owls Head, disagrees.

There remain about 90 properties with pending protected status from the Parks and Protected Areas Plan of 2013 and Miller has repeatedly called on the province to confer legal protection to all of them, a move that would result in about 14 per cent of Nova Scotia's land being protected.

"The government has been dragging its feet and this is exactly the problem," he said. "The longer it goes before the legal designation is applied, the more and more likely that it's going to get chipped away and that one site here will get tossed [and] another site will get tossed.

"Some economic opportunity of this or that will come along and before you know it the entire plan is undermined."

A spokesperson for the Lands and Forestry Department said officials are unaware of any other negotiations or requests from private parties for land on the Parks and Protected Areas list that hasn't been approved yet for protection.


Link:

N.S. won't protect land with 'globally rare' ecosystem that company eyes for golf resort - CBC.ca

OPPO announces three initiatives to co-build new intelligent service ecosystem with developers & partners – United News of India

New Delhi, Dec 19 (UNI) Chinese smartphone manufacturer OPPO on Thursday kicked off its 2019 OPPO Developer Conference (ODC) in Beijing under the theme of "Innovation and Intelligence", unveiling a range of initiatives to co-build a new intelligent service ecosystem with developers and partners.

In order to consolidate its global synergies, the Chinese phone maker announced three new initiatives: the enhanced developer support program Gravity Plan 2.0, for which OPPO will allocate RMB 1 billion (about USD 143 million) in 2020; the Five System-level Capability Exposure Engines; and the IoT Enablement Plan.

In the past year, OPPO has made remarkable progress in building a new ecosystem, boasting more than 320 million monthly active users globally on its ColorOS operating system and accumulating a massive number of users across its applications, services, and content ecosystem. To date, more than 120,000 developers have joined the OPPO open platform, where its open capability service has been used more than 3 billion times per day.

RMB 1 Billion towards Gravity Plan 2.0 in efforts to connect developers and users.

At the 2018 Developer Conference, OPPO officially launched the "Gravity Plan", an RMB 1 billion program to support outstanding developers worldwide. Since then, the plan has provided resources for more than 2,000 applications, resulting in 9.2 billion impressions and more than 180 million downloads.

Henry Duan, Vice President, Internet Services, OPPO, said in his keynote speech at the event that with the help of ColorOS, which is available in more than 140 countries and regions around the world, the Gravity Plan 2.0 will build on last year's plan to provide sustained and all-round support to partners in the four major fields of applications, services, content and going global.

Enabling Multi-scenario Convergent Experiences through Launching Five System-level Capability Exposure Engines.

In order to enable developers to connect to OPPO's system-level capabilities more smoothly and quickly, and to continue to bring a better experience to users, OPPO also officially launched the Five System-level Capability Exposure Engines, which include the Hyper Boost, Link Boost, CameraUnit, MediaUnit, and ARUnit capabilities. In this way, OPPO will improve user experience together with developers and partners.

Andy Wu, Vice President, Software Engineering, OPPO, said in a keynote speech, "With the launch of the Five System-level Capability Exposure Engines, OPPO will help developers leverage creativity, explore scenarios, maximize value and build a world of intelligent connectivity in which reality and the virtual realm integrate."

IoT Enablement Plan and HeyThings IoT service platform to accelerate the convergence of technology and service

Anticipating the integration and convergence of things to be the future, OPPO announced that it will launch the IoT Enablement Plan, a capacity opening program aimed at opening OPPO's HeyThings IoT protocol, HeyThings IoT service platform, and audio connectivity protocol to IoT partners. The newly upgraded HeyThings IoT service platform is expected to be deployed through OPPO's open platform in Q1 2020, while the first phase of the audio connectivity protocol is expected to be in service in June 2020.

Bobee Liu, Vice President, Intelligent Mobile Devices, OPPO, also revealed that OPPO will soon debut its first smartwatch, the OPPO Watch, as well as a health platform, which will be positioned as a strategic device that OPPO will use to create a robust ecosystem with its partners. Tony Chen, Founder and CEO, said at this year's OPPO INNO DAY that OPPO has been more than just a phone maker from the outset. OPPO plans to invest RMB 50 billion (USD 7 billion) in R&D over the next 3 years to develop core technologies in hardware, software and systems, in addition to 5G, AI, AR, big data and other frontier technologies. By actively collaborating with partners across the industry for a future of shared success, OPPO is forging a new ecosystem of smart services in the era of intelligent connectivity. UNI GK SHK1737

Continued here:

OPPO announces three initiatives to co-build new intelligent service ecosystem with developers & partners - United News of India

Ecosystem | Definition of Ecosystem by Merriam-Webster


1 : the complex of a community of organisms and its environment functioning as an ecological unit That influx of fresh water alters the ocean's salinity near the seafloor, a factor that influences the makeup of the ecosystems in those places. Sid Perkins Global warming, if it proceeds as many scientists predict, threatens to undo decades of conservation work and could mean the destruction of the monarch butterfly, the edelweiss, the polar bear and innumerable other species living in fragile ecosystems, an emerging body of scientific evidence suggests. William K. Stevens

2 : something (such as a network of businesses) considered to resemble an ecological ecosystem especially because of its complex interdependent parts Newspaper layoffs have ripple effects for the entire local news ecosystem because, as the Congressional Research Service noted, television, radio and online outlets often "piggyback on reporting done by much larger newspaper staffs." David Sirota Lots of Walmart customers are underserved by banks and other financial institutions, [Daniel] Eckert says; the company's experiments with finance-related products and services help customers "not only save money but also have access to a financial ecosystem they were crowded out from." Rob Walker

See the original post:

Ecosystem | Definition of Ecosystem by Merriam-Webster

Has Arm Discovered the Ecosystem Keys? – The Next Platform

Arm server development is a reality, and a growing one at that, not just from a performance point of view but also, perhaps more importantly, from an ecosystem view.

Be it the Marvell ThunderX2 processor or the Ampere eMAG Skylark processor, the hyperscale, cloud and enterprise ecosystems are willing to adopt these new processors to further improve their TCO or dollars per core.

The all-important ecosystem is catching up with Arm, which is key to the momentum necessary to make Arm servers a sustainable reality. With AWS launching its own Arm instances, i.e. Graviton processors, there's the much-needed push to make the software ecosystem more widely accepted in the industry. Not just that, AWS even announced bare-metal offerings for EC2 A1 instances.

Slowly but steadily, Arm has also made a mark for itself in high performance computing, something we expect to see in full force at this years Supercomputing Conference. Arm has the most traction in terms of deployments and software development in HPC in the United States, Europe and Japan with each region leading the way along different trajectories to deploy systems based on the Arm architecture for their supercomputers.

All of this has taken time and extended development, of course. The first wave of Arm-based servers came between 2010 and 2014 and was more experimental in nature than real production systems.

The first 64-bit Arm design, the ARMv8-A, was introduced in 2011, and since then the Arm server ecosystem has seen lots of ups and downs. ZT Systems launched a 1U data center Arm server based on (32-bit) Cortex-A9 cores in November 2010, which was supposed to be an energy-efficient and denser solution compared to Intel servers. Then came Calxeda with its own 32-bit Arm server, the EnergyCore ECX-1000, which did not see adoption, and Calxeda eventually went defunct in 2013. In 2011, AppliedMicro launched the X-Gene 1 processor, followed by the X-Gene 2 in 2014. Samsung, Cavium (now Marvell) and AMD came up with their own Arm processors that tried to penetrate the server market but could not generate tangible interest among end users.

Arm servers have undergone a transformation in terms of development, and early signs of this were seen in a semi-secret project within Broadcom taking shape as Project Vulcan. The idea was to develop a serious, world-class 64-bit Arm server to take on Intel in the HPC and cloud markets.

In late 2016, when Avago gave up on Broadcom's ambitions to develop a first-class Arm server, Cavium jumped in, brought the Vulcan IP and team on board and fully funded the Vulcan project, re-christened the Cavium ThunderX2, now the Marvell ThunderX2. In more ways than one, the ThunderX2 is a serious contender to Intel and AMD in the HPC, hyperscale and cloud businesses.

To make things better for the Arm ecosystem, in 2017 a brand new company, Ampere Computing, bought the X-Gene assets and re-introduced the X-Gene processor as the Ampere eMAG processor. It should be mentioned that Qualcomm tried its hand at building a true data center Arm server, the Centriq, based on the Falkor architecture, and given Qualcomm's standing, with time it could have made its data center server project a success. However, for reasons unknown to many, it chose to significantly disinvest, and many personnel from Qualcomm's Centriq project were hired by Ampere Computing in Raleigh. Huawei has a very compelling Arm server offering in the Kunpeng 920, a 7 nm, 64-core CPU.

Figure 1: Diverse Arm architectures (source)

The question many have is whether the Arm server ecosystem is mature enough to be excited about.

The ecosystem has come a long way toward becoming a stable one. However, it has many miles to go to reach the same level as x86. Given this momentum, it would not be surprising if the likes of Google, Facebook and Tencent are actively experimenting with Arm platforms. Amazon and Microsoft have already invested in Arm platforms in their respective clouds, AWS and Azure.

Figure 2: Commits to Linux GitHub repository for x86 vs. arm64 as of 13th November, 2019

The contributions toward enabling aarch64 in the Linux operating system have steadily increased since 2012, while the growth rate for x86 has not been as consistent. These are good indications that the Arm ecosystem is here to stay and growing.
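That commit trend is straightforward to reproduce against a local kernel checkout by tallying the years of every commit touching each architecture's subtree. A rough sketch, assuming a hypothetical local clone path; the tallying function itself needs no repository:

```python
import subprocess
from collections import Counter

def commits_per_year(log_years):
    """Tally commit counts per year from a list of year strings."""
    return Counter(log_years)

def subtree_commit_years(repo_path, subtree):
    """Years of all commits touching `subtree` in a local git clone.

    Assumes `repo_path` points at a checkout of the Linux kernel tree
    (the path below is illustrative, not from the article).
    """
    out = subprocess.run(
        ["git", "-C", repo_path, "log", "--pretty=%ad",
         "--date=format:%Y", "--", subtree],
        capture_output=True, text=True, check=True).stdout
    return out.split()

# e.g. commits_per_year(subtree_commit_years("~/src/linux", "arch/arm64"))
# compared with the same call for "arch/x86" shows the relative trend.
```

Plotting the two counters side by side per year gives roughly the comparison shown in Figure 2.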

An ongoing debate among software engineers is whether to implement business logic in a monolithic architecture or to break the same logic down into multiple pieces. There is a growing trend of organizations moving to a microservices architecture for various reasons, be it unit testing, ease of deployment or server performance, among many others. Also, microservices-based architectures are relatively easy to scale compared to a monolith. Linaro, Arm and Arm server manufacturers are leading this charge, and Packet is providing the developer community a platform to develop and sustain the ecosystem.

If there's one area where Arm servers have taken the biggest strides, it is definitely High Performance Computing (HPC). The Arm ecosystem for HPC is also the most developed compared to Arm's progress in cloud datacenters.

The momentum for Arm in HPC was driven by many centers, notably by Dr. Simon McIntosh-Smith of the University of Bristol, who with Cray hosted the first Isambard Hackathon to optimize HPC applications for ThunderX2-based servers in November 2017 at Bristol. This was promptly followed up by a second Isambard Hackathon in March 2018.

Most HPC applications compile and run out of the box on Arm-based servers, with support from the Arm compilers, GCC, OpenMPI and OpenMP.
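In practice, "out of the box" often comes down to a build script picking sensible per-architecture compiler flags. A minimal, hypothetical sketch of such a selection step (the flag choices are illustrative defaults, not taken from the article; tune them for your compiler and CPU):

```python
import platform

# Illustrative baseline GCC flags per machine architecture; -mcpu is the
# conventional tuning knob on AArch64, -march on x86-64.
ARCH_FLAGS = {
    "aarch64": ["-O3", "-mcpu=native", "-fopenmp"],   # e.g. a ThunderX2 node
    "x86_64":  ["-O3", "-march=native", "-fopenmp"],  # e.g. a Skylake node
}

def gcc_flags(machine=None):
    """Pick baseline GCC optimization flags for the build machine."""
    machine = machine or platform.machine()
    # Conservative fallback for architectures we have not profiled.
    return ARCH_FLAGS.get(machine, ["-O2", "-fopenmp"])
```

With a table like this, the same makefile drives builds on both Arm and x86 nodes, which is most of what "it just works" means for portable OpenMP codes.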

I participated in both, representing Cavium Inc., assisting developers, architects and engineers in optimizing their codes and applications for ThunderX2 processors. Collectively, we optimized key HPC applications like NAMD, UM-NEMO, OpenFOAM, NWCHEM and CASTEP, and compared them against Intel CPU architectures like Broadwell and Skylake. Prof. McIntosh-Smith and his team did a detailed study identifying the opportunities and benefits of Arm servers relative to the incumbent Intel servers, with compelling performance per dollar for the Arm-based servers.

Figure 3: Cray-Isambard performance comparison on mini-apps

Figure 4: Cray-Isambard performance comparison on key Archer applications

Figure 5: Cavium Inc. published HPC Performance comparison vs. Intel Skylake CPUs (2017)

This was the significant momentum that Arm servers needed in the HPC space. The two Isambard hackathons also fast-tracked Arm HPC development, with Arm optimizing its compilers as well as math libraries in collaboration with Arm server manufacturers like Cavium Inc. (now Marvell Semiconductors). There is tremendous movement in Arm HPC performance-library optimization: Arm has invested in optimizing GEMM, SVE, spMM, spMV and FFT libraries in collaboration with developers and silicon manufacturers like Marvell. Arm Allinea Studio has successfully established itself as the go-to tool for Arm server workload analysis, similar to what VTune is for Intel.

Another major milestone was the Vanguard Astra Arm-based supercomputer at Sandia National Laboratories, backed by the DoE, Cavium and HPE. This is the first Arm-based supercomputer to make the TOP500 list, standing at 156th position as of June 2019 and 198th in the November 2019 rankings. The building blocks are HPE Apollo 70 platforms and Marvell ThunderX2 CPUs with a 4x EDR InfiniBand interconnect. The Astra supercomputer is made up of 2,592 compute servers, i.e. 145k cores and 663 TB of memory. The US DoE is making a concerted effort to invest in diverse as well as future-proof technologies such as Arm on its path toward achieving exascale computing.

Figure 6: Astra, the Arm based supercomputer debuted on the TOP500 list in November 2018

Europe and Asia are taking huge strides in deploying Arm-based clusters and systems for HPC and research. Be it the Mont-Blanc, Isambard or CINECA-E4 projects in Europe or Japan's Arm-based Fugaku supercomputer, it's just the beginning of a new era for Arm in HPC. Cray is betting big with the A64FX Arm chip built by Fujitsu. The A64FX prototype is number one on the Green500 list and 160th on the TOP500 list.

HPC workloads tend to be highly parallelizable in nature, and Arm CPUs provide an opportunity to leverage lots of cores at reasonable price points. Further, having competition in the CPU market benefits all buyers, not just HPC shops, in negotiating the best resources for their workloads.

Marvell is a pioneer in more ways than one in introducing the Arm server ecosystem to the hyperscale world, with Marvell and Microsoft partnering on ThunderX2 platforms for Azure. Oracle has invested $40 million in Ampere Computing, home of the ARMv8 eMAG processor. Oracle also plans to massively expand its datacenter footprint in the coming months, and this investment in Ampere could mean potential deployment of eMAG processors in Oracle data centers.

In the recent past, there's been a slew of announcements regarding enhancements to the Arm ecosystem. VMware announced 64-bit Arm support. DDN officially announced professional support for Lustre on Arm servers in 2018, and AMI announced firmware support for the Marvell ThunderX2 Arm-based servers in March 2019.

NVIDIA announced CUDA support for Arm at ISC19 and backed it up with a major announcement of a reference design to enable organizations to build GPU-accelerated Arm-based servers, a big shift toward enabling Arm to succeed in the HPC and accelerated computing segments. Imagine a system with power-efficient Arm-based CPUs, GPUs for training and AI ASICs for inference. Machine learning and artificial intelligence pose interesting opportunities, and the collaboration with NVIDIA will enable this segment for Arm-based solutions.

Like Intel, AMD and Arm, Ampere Computing too has created a developer program for developers to build and expand its cloud ecosystem. This will enable further and faster integration of Arm servers into the hyperscale and datacenter world in a much more open and collaborative way.

While the ecosystem still needs more time to grow and mature, it is steadily moving toward that nirvana of "it just works." With the emergence of Arm in the computer architecture world, along with RISC-V and many other semiconductor start-ups, it's only a matter of time until aarch64 is as normal as x86. That is what the community is striving toward.

Once developers are convinced that their software stack just works on Arm servers, it will be a big win for the Arm server ecosystem, and I for one am willing to make the bold claim that for many workloads, especially HPC, "it just works."

About the Author

Indraneil Gokhale is a Performance Engineer and leads the Hardware Engineering team at Box Inc. He previously worked at Cavium (now Marvell), Uber and Intel, and has experience optimizing HPC applications and workloads for x86 and aarch64 architectures. He has published white papers and book chapters on optimizing the Weather Research and Forecasting (WRF) application. Indraneil holds a master's degree in Electrical Engineering from Auburn University, USA, and a bachelor's degree in EEE from Jawaharlal Nehru Technological University, Hyderabad, India.

Read more:

Has Arm Discovered the Ecosystem Keys? - The Next Platform