NASA needs a cheaper, faster way to bring Mars dirt back to Earth – The Verge

Budget constraints have NASA looking for a faster, cheaper way to bring samples from the Martian surface back to Earth. In a teleconference on Monday, NASA Administrator Bill Nelson said that an independent review concluded that the agency's current plan to return the first samples collected by the Mars rover Perseverance could cost up to $11 billion and would likely not be achievable until 2040. The space agency's fiscal 2025 budget, along with additional anticipated budget cuts, is behind the slow pace of the current plan.

"That is unacceptable to wait that long," Nelson said about the mission to return samples of dust and rocks from Mars to Earth. "It's the decade of the 2040s that we're going to be landing astronauts on Mars."

NASA is planning to solicit ideas from its various centers and the Jet Propulsion Laboratory for a quicker, cheaper return mission. Nelson said the agency is aiming for a budget of under $7 billion and is hoping to bring the samples back in the 2030s.

The independent review, conducted last September, raised numerous concerns over the feasibility of NASA's Mars Sample Return mission. NASA had originally estimated that the return mission's launch would take place in 2027 or 2028, but the review concluded that this would be impossible due to technical issues, risks, and performance to date.

In an X post on Monday, SpaceX founder Elon Musk wrote that the company will be responding to NASA's solicitation for alternatives and that the company's Starship rocket has the potential to return serious tonnage in less than five years.

Starship has run into its own delays and challenges. The most recent launch of a prototype, in March, was largely successful, but SpaceX lost contact with the rocket as it reentered Earth's atmosphere.


Pioneering Emotional Intelligence in Robotics: The Rise of Emo – yTech

In a breakthrough for robotics and artificial intelligence (AI), a robot named Emo stands as a testament to technological ingenuity, possessing the capability to learn and replicate human emotional expressions. This development marks a significant stride in narrowing the emotional divide between humans and machines, potentially reshaping the way we interact with robots in a multitude of sectors.

Core Innovation Behind Emo's Emotional Acuity

Emo's core innovation lies in its dual neural network architecture, which empowers the robot with unprecedented emotional intelligence. By utilizing advanced cameras and motor systems, Emo can observe and assimilate human expressions. Over time, its capacity to respond in contextually relevant ways improves, making human-robot interactions increasingly natural and seamless.

Professor Hod Lipson and his team are the visionaries behind Emo's conceptualization and realization. Their work paves the way for a future where robots can forge emotional bonds with humans, setting a new benchmark in social robotics.

Potential for Transformative Impact Across Industries

The ripple effect of Emo's introduction is vast, with implications for customer service, therapy, elder care, and education. It foretells significant growth within the social robotics market, with affordable manufacturing techniques on the horizon and analysts predicting robust market development, bolstered by the integration of empathetic robots into everyday life.

Navigating the Ethical Considerations of Advanced Robotics

Notwithstanding the advancements and promises of Emo's technology, ethical questions loom. Issues surrounding emotional authenticity, privacy, and employment disruptions accentuate the need for conscientious deployment of such robots. This underscores the importance of engaging with ethics-focused organizations like IEEE and ACM, which strive to establish standards that balance technological progress with societal well-being.

In summary, Emo represents a fusion of AI and emotional perception, potentially revolutionizing human-robot interaction and industry practices. Its advent warrants thoughtful consideration of the ethical landscape as we embrace the age of emotionally intelligent machines. The robotic companion's evolution and the industry's path forward will be characterized by ethical vigilance, research brilliance, and insightful analysis, jointly shaping the role of robotics in our future.

Expanding the Market Forecast for Emotionally Intelligent Robots

The global market for social and emotional robotics is expected to experience substantial growth over the coming years. According to a report by MarketsandMarkets, the social robot market in particular is expected to rise from USD 918 million today to over USD 3,900 million by the next decade, expanding at a CAGR of 14.5% during the forecast period. This growth is fueled by increasing adoption in sectors such as personal assistance, education, and healthcare, where such robots can perform tasks ranging from companionship to assisting with cognitive therapy and rehabilitation.
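As a rough sanity check on the quoted figures, the implied time horizon follows from the compound-growth formula. The sketch below (plain Python; the inputs are the USD 918 million starting figure, the USD 3,900 million target, and the 14.5% CAGR from the forecast above) computes how many years that growth rate needs to carry the market from start to target:

```python
import math

def cagr_horizon(start, end, rate):
    """Years needed for `start` to grow to `end` at compound annual rate `rate`."""
    return math.log(end / start) / math.log(1.0 + rate)

# Figures quoted from the MarketsandMarkets forecast above.
years = cagr_horizon(918, 3_900, 0.145)
print(f"{years:.1f} years")  # → 10.7 years
```

At 14.5% a year, reaching USD 3,900 million from USD 918 million takes roughly 10.7 years, so the "by the next decade" framing is consistent with the stated CAGR.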

The emergence of robots like Emo will spur further research and development, reducing costs and enhancing functionalities. This will likely attract investment and increase the accessibility of these robots, thus making them more commonplace in both consumer and commercial environments.

Challenges and Controversies Within the Robotics Industry

Despite these promising market forecasts, the robotics industry faces challenges and controversies that could impact the emotional intelligence sector. One of the primary concerns is job displacement, as robots become capable of performing tasks typically reserved for human workers. This could lead to significant shifts in the labor market and necessitate retraining for those whose jobs are affected.

Another key consideration is data privacy and security, especially with robots that can collect and analyze personal emotional data. Ensuring that this information is used responsibly and securely is paramount to maintaining public trust.

For research, development, and the establishment of standards in robotics, resources can be found through organizations such as IEEE and ACM.

Summary and Industry Outlook

In conclusion, Emo exemplifies the potential for emotion recognition in robotics to drive innovation across various sectors. The social and emotional robot industry is anticipated to flourish, bringing about advancements in how these machines are integrated into our daily lives. As the industry progresses, it will be essential to monitor market dynamics, foster ethical practices, and encourage responsible innovation, thereby ensuring that the evolution of robots like Emo contributes positively to society.

The success of products like Emo and the industry's trajectory will heavily rely on striking a balance between innovation and the humane and ethical application of technology. Thought leaders, developers, and policymakers will need to collaborate to navigate these challenges successfully. The trends in the robotics industry point towards a future where emotionally intelligent machines become an integral part of the fabric of society, enhancing human life while addressing the ethical implications of such profound technological integration.

Leokadia Gogulska is an emerging figure in the field of environmental technology, known for her groundbreaking work in developing sustainable urban infrastructure solutions. Her research focuses on integrating green technologies in urban planning, aiming to reduce environmental impact while enhancing livability in cities. Gogulska's innovative approaches to renewable energy usage, waste management, and eco-friendly transportation systems have garnered attention for their practicality and effectiveness. Her contributions are increasingly influential in shaping policies and practices towards more sustainable and resilient urban environments.


There Might Be No ChatGPT-like Apple Chatbot in iOS 18 – The Mac Observer

The recent months in the tech scene have been all about artificial intelligence and its impact, but one company that has been late to the party is Apple. Apple first hinted at in-house AI development during a recent earnings call, which followed earlier reports that the company had reached out to major publishers about using their data to train its AI models, and had canceled the Apple Car project and shifted that team to AI. However, according to Bloomberg's Mark Gurman, Apple might not debut a ChatGPT-like chatbot at all. Instead, the company is exploring deals with established tech giants such as China's Baidu, OpenAI, and Google about potential partnerships.

That said, Apple might instead focus on licensing already-established chatbots like Google's Gemini (formerly Bard) or OpenAI's ChatGPT, and may delay all plans to release an Apple chatbot, internally dubbed Ajax GPT.

Nevertheless, Mark Gurman believes AI will remain in the show's spotlight at the upcoming Worldwide Developers Conference (WWDC), slated for June 10-14, 2024, where we expect to see iOS 18, iPadOS 18, watchOS 11, tvOS 18, macOS 15, and visionOS 2. Although he doesn't delve into details of the upcoming AI features, he mentions the company's plans to unveil new AI capabilities, which could serve as the backbone of iOS 18. This suggests that even if Apple doesn't intend to bring a native AI chatbot to its devices, we might see a popular chatbot pre-installed on the phones or supported natively by the device. For reference, London-based consumer tech firm Nothing recently partnered with the Perplexity AI search engine to power its latest release, the Phone (2a), and Apple might have similar plans, but with generative AI giants.

CEO Tim Cook recently told investors that the company will disclose its AI plans to the public later this year. Despite Apple's overall reticence on the topic, Cook has been notably vocal about the potential of AI, particularly generative AI.

More importantly, according to previous reports, he has indicated that generative AI will improve Siri's ability to respond to more complex queries and enable the Messages app to complete sentences automatically. Furthermore, other Apple apps such as Apple Music, Shortcuts, Pages, Numbers, and Keynote are expected to integrate generative AI functionality.


Ex VR/AR lead at Unity joins new spatial computing cloud platform to enable the open metaverse at scale, AI, Web3 – Cointelegraph

The metaverse is reshaping the digital world and entertainment landscape. Ozone's platform empowers businesses to create, launch and profit from various 3D projects, ranging from simple galleries or meetup spaces to AAA games and complex 3D simulations, transforming how we engage with immersive content in the spatial computing era.

Apple's visionOS launch is catalyzing mainstream adoption of interactive spatial content, opening new horizons for businesses. 95% of business leaders anticipate a positive impact from the metaverse within the next five to ten years, potentially establishing a $5 trillion market by 2030.

The Ozone cloud platform has the potential to become the leading spatial computing cloud. Source: Ozone

The future of 3D technology seamlessly blends the virtual and physical realms using spatial computing technology. But spatial computing can be challenging, especially when the tools are limited and the methods for creating 3D experiences are outdated.

The well-known venture capital firm a16z recently pointed out that it's time to change how game engines are used for spatial computing, describing the future of 3D engines as a "cloud-based 3D creation engine," which is exactly what the Ozone platform aims to be.

The Ozone platform is a robust cloud computing platform for 3D applications. Source: Ozone

The platform's OZONE token is an innovative implementation of crypto at the software-as-a-service (SaaS) platform level. "You can think of the OZONE token as the core platform token that will unlock higher levels of spatial and AI computing over time, fully deployed and interoperating throughout worlds powered by our cloud."

"Ozone is fully multichain and cross-chain, meaning it supports all wallets, blockchains, NFT collections and cryptocurrencies, and has already integrated several in the web studio builder with full interoperability across spatial experiences," said Jay Essadki, executive director for Ozone.

Ozone Studio has already integrated and validated cross-chain spatial computing interoperability. Source: Ozone Studio

He added, "You can think of the Ozone composable spatial computing cloud as an operating system, or as a development environment. It continuously evolves by integrating new technologies and services."

The OZONE token, positioned as the currency of choice, offers not just discounts and commercial benefits but also, through the integration with platform oracles and cross-chain listings, enables the first comprehensive horizontally and vertically integrated Web3 ecosystem for the metaverse and spatial computing era.

Ozone eliminates technical restrictions and makes spatial computing, Web3 and AI strategies accessible to organizations looking to explore the potential of the metaverse with almost no technical overhead or debt.

Ozone is coming out of stealth with a cloud infrastructure supported by AI and Web3 microservices and is expanding its executive, engineering and advisory teams as it raises more capital, with a view to replacing legacy game engines such as Unreal or Unity.

At the same time, Ozone fully supports assets created in those engines, which can be deployed on the Ozone platform across Web2 and Web3 alike.

Ozone is also in active enterprise and government discussions, and has been establishing and closing enterprise and government customer relationships ahead of its initial cloud infrastructure deployment.

Ozone welcomes new advisors as the platform comes out of stealth.

Ozone's new 2024 advisors aim to make the open metaverse happen.

Ozone will finalize a full game engine based on fully integrated micro-templates that will make building and deploying games and 3D spatial computing experiences as simple as clicking a few buttons; the system is already working.

The upcoming features on the Ozone 3D Web Studio. Source: Ozone

Ozone is announcing a new suite of templatized games with multi-AI integration: three completed games (Quest, Hide and Seek, and RPG) are coming in 2024, with more underway.

This opens up a new way of building interactive 3D experiences.

Ozone helps companies to build and share 3D experiences. Source: Ozone

At the heart of Ozone is the innovative Studio 3D development platform, complemented by a marketplace infrastructure to support e-commerce and the economy.

Ozone's SaaS platform empowers businesses to create, deploy and monetize spatial computing experiences at scale for Web3 or traditional e-commerce applications. The platform's features, including social infrastructure, AI integration and gamification elements, enhance the interactive aspect of 3D experiences, digital twins and spatial data automation, while providing full interoperability and portability of content and data across experiences and devices.

Ozone's vision of becoming the industry standard for interactive 3D development, with compatibility and accessibility across devices, positions it as a catalyst for innovation in media and entertainment. Ozone is set to play a key role in shaping the future of immersive spatial web experiences.

Ozone has secured investments from prominent Web3 VC funds and is opening its first-ever VC equity financing round.

Disclaimer. Cointelegraph does not endorse any content or product on this page. While we aim to provide you with all the important information we could obtain in this sponsored article, readers should do their own research before taking any action related to the company and bear full responsibility for their decisions. This article cannot be considered investment advice.
