China sends human bone cells to Tiangong space station (video) – Space.com

A Chinese space freighter has delivered human bone cells to the Tiangong space station for on-orbit research.

The Tianzhou 7 cargo spacecraft launched on a Long March 7 rocket from Wenchang Satellite Launch Center on Jan. 17, reaching the Tiangong space station just over three hours later. Among its cargo of around 12,350 pounds (5,600 kilograms) were more than 60 experiments, including human bone cells for research into bone mineral density.

The cells grow quickly, so the experiment had to be installed just hours before launch to ensure optimal cell activity before the cells fill the space available to them. Their growth will be closely monitored, and data will be transmitted back to Earth for analysis.

Related: The latest news about China's space program

"Our experimental equipment in space will guarantee the physical and chemical conditions for cell culture, like the replacement of nutrient fluid and the gas for bone cells," Shang Peng, a professor at Northwestern Polytechnical University, told CCTV.

"It also carries a fluorescence microscope and an ordinary light microscope to monitor the growth of the cells. Some of this information will be recorded and transmitted to Earth for analysis in real time and in the future."

Astronauts in orbit are instructed to exercise for hours each day to help prevent the bone loss associated with living long-term in a microgravity environment.

"This can exercise their skeletal muscles and also prevent their bone loss effectively," Shang said.

Such research, which is also being conducted cooperatively with scientists from other countries, could lead to better ways to tackle bone loss problems experienced on Earth as well as in space.

"We will do further projects in the operation phase of China's space station. Based on these, we will develop relevant medicines and test them. They will not only help taikonauts in space but also humans on Earth, particularly the elderly. This will be very meaningful," said Wang Jinfu, a professor at Zhejiang University in east China.

China's Tianzhou 1 mission, launched in 2017 as a prototype space station cargo and refueling mission to dock with the Tiangong 1 space lab, carried stem cells. That was a rare opportunity for such tests.

Having the Tiangong space station in its operational phase, with the Tianhe core module recently passing 1,000 days in orbit, means more regular opportunities for China's science community to carry out scientific experiments in space.


Syracuse native Jeanette Epps to join NASA’s Crew-8 Mission to the International Space Station – Democrat & Chronicle


Ax-3 astronaut snaps dizzying photo of ISS’s jam-packed interior – Space.com

A new view from inside the International Space Station captures a dizzying number of experiments underway in orbit.

European Space Agency (ESA) project astronaut Marcus Wandt recently shared a photo he took while floating in the microgravity environment of the orbiting lab's Destiny module. Destiny is the International Space Station's primary research laboratory and is therefore home to a wide range of experiments and studies.

In the photo, which Wandt shared on X (formerly Twitter) on Jan. 25, the walls of the Destiny module are lined with various pieces of equipment and cords strung about to keep all of the tools tethered. Wandt's legs and feet can also be seen floating in the foreground of the photo due to the weightlessness astronauts experience inside the spacecraft.

Related: International Space Station at 20: A photo tour

The Destiny module has 24 equipment racks, which support various studies related to health, safety and humans' quality of life. The space station offers researchers a unique opportunity to conduct experiments in the absence of gravity, thus allowing them to better understand humans and the world in which we live.

"An astronaut's perspective," Wandt wrote in the X post. "How does this photo make you feel: relaxed, stressed, giddy or wanting to rearrange everything?"

Wandt launched to the space station on Jan. 18 as part of Axiom Space's Mission 3 (Ax-3). Joined by mission specialist Alper Gezeravcı of Turkey, commander and former NASA astronaut Michael López-Alegría (who has dual U.S. and Spanish citizenship), and mission pilot and Italian Air Force Col. Walter Villadei, Ax-3 carries Axiom's first all-European crew.

The four Ax-3 astronauts are living and working in orbit for up to two weeks. They are tasked with more than 30 experiments spanning various fields of science and technology, aimed at advancing human spaceflight and enhancing life on Earth.

While some may see Wandt's photo and think the inside of the module appears a bit cluttered without the force of gravity to hold all of the equipment neatly in place, others may feel relaxed by the idea of floating weightless through space. Despite the apparent disorganization, astronauts are trained to maintain a high standard of cleanliness to ensure the safety and functionality of the space station.

So, the question remains: How does this photo make you feel?


First metal 3D printer heading to International Space Station – The Engineer

The logistical burden of resupplying future Moon bases with tools or load bearing parts could be alleviated if astronauts could manufacture such items with a metal 3D printer.

To that end, the first metal 3D printer launched on January 30, 2024, onboard NASA's NG-20 resupply mission, which is heading to the International Space Station.

In the coming days, the printer will be set up in the Columbus module (the science lab onboard the ISS) by astronaut Andreas Mogensen and operated to 3D print the first metallic part in space.

The printer was developed by Airbus, AddUp, Cranfield University and Highftech Engineering under a European Space Agency (ESA) programme.

In a statement, Gwenaëlle Aridon, Airbus Space Assembly lead engineer, said: "Astronauts will be able to directly manufacture tools such as wrenches or mounting interfaces that could connect several parts together. The flexibility and rapid availability of 3D printing will greatly improve astronauts' autonomy."

Sébastien Girault, metal 3D printer system engineer at Airbus, said the printer is the size of a washing machine and can print parts that are 9 cm high and 5 cm wide.

As well as overcoming the challenge of size, the printer will sit in a sealed metal box to protect against the aggressive printing environment caused by the laser and the heat it generates.

"Gravity management is also key, which is why we chose wire-based printing technology," said Girault.

Furthermore, fumes that are emitted will be addressed by filters and captured inside the machine so that they do not contaminate the air inside the ISS.

Two printers will be used for this experiment: the flight model inside the ISS; and the engineering model on Earth. The astronauts will print four samples in space, which will be sent back to Earth for analysis. The same specimens will be manufactured using the engineering model printer.

"In order to evaluate the effects of microgravity, ESA and the Danish Technical University will perform mechanical strength and bending tests and microstructural analysis on the parts made in space and compare them to the other specimens," said Girault.

According to Airbus, there are several plastic 3D printers on board the ISS, and astronauts have used them to replace or repair plastic parts that would have taken months to arrive if built and transported from Earth.

This logistical constraint will intensify on future Moon and Mars stations: raw material will still need to be launched, but printing parts on demand is more efficient than transporting each finished part to its destination.

"Increasing the level of maturity and automation of additive manufacturing in space could be a game changer for supporting life beyond Earth," said Aridon. "Thinking beyond the ISS, the applications could be amazing. Imagine a metal printer using transformed regolith or recycled materials to build a lunar base."


SpaceX launches UF/IFAS microbiology experiment to ISS – University of Florida

Treating staph infections can be tricky in the best of times. But what happens if you get infected while in space?

That's the scenario Kelly Rice, associate professor at the University of Florida Institute of Food and Agricultural Sciences, hopes to help understand. Rice's experiment will be launched today, Jan. 30, to the International Space Station on a SpaceX rocket.

Staph, or Staphylococcus aureus, is a type of bacteria found in the nose or on the skin of up to 30% of humans, but under certain conditions, it can thwart the human body's physical and immune defenses and cause severe infection. Staph infections can be particularly troublesome for people in close quarters, such as astronauts.

A previous study done by Rice and her colleagues found that the bacteria had the potential to be more dangerous to astronauts while in microgravity.

The current experiment will include growing the bacteria in enclosed canisters to better understand how microgravity affects expression of disease-spreading properties, how the bacteria grows and other factors.

"We are grateful to NASA for the opportunity to study this bacteria, and the information gained may apply to other bacteria as well," Rice said. "We hope that these results will help guide strategies to maintain astronaut health during long-term space flight missions."

This study was funded by a grant from the NASA Biological Sciences Division's Space Biology Program.

Meredith Bauer January 30, 2024


Northrop Grumman launches science investigations, supplies to space station – Vero News

KENNEDY SPACE CENTER - Northrop Grumman launched a variety of scientific experiments and equipment, including a surgical robot and a 3D cartilage cell culture, to the International Space Station on Tuesday. Skywatchers on Florida's east coast, including portions of the Treasure Coast, watched the launch under clear skies, followed by a booster landing accompanied by a thunderous sonic boom.

The liftoff was part of Northrop Grumman's 20th Commercial Resupply Services mission, NASA officials said. Viewers on the Space and Treasure coasts can soon expect to see more launches, including the SpaceX Crew-8 launch slated for Feb. 22.

Northrop Grumman officials named the recent mission after NASA astronaut Patricia "Patty" Hilliard Robertson, a medical doctor, pilot and space medicine fellow who died in a plane crash in 2001.

Northrop Grumman, SpaceX and NASA coordinated the event. The Cygnus cargo spacecraft, manufactured by Northrop Grumman, launched at 12:07 p.m. atop a Falcon 9 rocket from Space Launch Complex 40 at Cape Canaveral.

Cygnus will reach the space station in two days, NASA officials said. This marks the seventh launch for SpaceX this year.

The cargo is carrying more than 8,200 pounds of supplies to the space station. The spacecraft will deliver the first surgical robot on the space station, an orbit reentry platform that collects thermal protection systems data and a 3D cartilage cell culture that will help astronauts keep healthy cartilage in microgravity, NASA officials said.

The cargo also has a metal 3D printer that will test the capability for printing small metal parts. The MSTIC facility (Manufacturing of Semiconductors and Thin-film Integrated Coatings) is another science experiment headed to space.

The facility, developed by Jacksonville-based Redwire Space, has a manufacturing capability to make high-quality, lower-cost semiconductor chips at a fast rate, NASA officials said. Semiconductors are critical components that power many of the tools people use every day, including smartphones, computers, vehicles and medical devices, Redwire Space officials said.

MSTIC also has an autonomous manufacturing capability that can replace several machines and processes that are required to create semiconductor devices.

"The true potential of manufacturing in space lies in the unique conditions of space. Producing films in orbit could lead to significantly improved crystal structures, minimizing irregularities often seen in Earth-based manufacturing," Tere Riley, director of marketing and communications for Redwire Space, told VeroNews. "This could mean films with more uniform thickness, enhanced conductivity, and greater efficiency, ultimately boosting the performance of the devices they're used in."

The Cygnus spacecraft will remain at the space station until July, when it will descend back to Earth and burn up in the atmosphere, NASA officials said.


Cygnus Soars on SpaceX Rocket to Resupply International Space Station – SciTechDaily

Northrop Grumman's Cygnus spacecraft, loaded with more than 8,200 pounds of supplies, launched to the ISS on a SpaceX Falcon 9 rocket, marking the 20th resupply mission by Northrop Grumman for NASA. Credit: SpaceX

A fresh supply of more than 8,200 pounds of scientific investigations and cargo is on its way to the International Space Station on a Northrop Grumman Cygnus resupply spacecraft after launching on a SpaceX Falcon 9 rocket at 12:07 p.m. EST Tuesday from Space Launch Complex 40 at Cape Canaveral Space Force Station in Florida.

About 15 minutes after launch, Cygnus reached its preliminary orbit. About two hours after launch, the spacecraft successfully deployed its two solar arrays.

A successful liftoff from Space Launch Complex 40 at Cape Canaveral Space Force Station in Florida as Northrop Grumman's Cygnus spacecraft, atop a SpaceX Falcon 9 rocket, heads to the International Space Station for the 20th Northrop Grumman resupply mission on Tuesday, January 30, 2024. The spacecraft is expected to reach the space station on Thursday, February 1, 2024, bringing 8,200 pounds of science investigations, supplies, and equipment for the international crew. Credit: Kim Shiflett

Cygnus is scheduled to arrive at the space station around 4:15 a.m. Thursday, February 1.

NASA+, NASA Television, the NASA app, and the agency's website will provide live coverage of the spacecraft's approach and arrival beginning at 2:45 a.m.

NASA astronaut Jasmin Moghbeli will capture Cygnus using the station's Canadarm2 robotic arm, and NASA astronaut Loral O'Hara will act as backup. After capture, the spacecraft will be installed on the Unity module's Earth-facing port.

This is Northrop Grumman's 20th contracted resupply mission for NASA.

Northrop Grumman's Cygnus spacecraft, atop a SpaceX Falcon 9 rocket, soars from Space Launch Complex 40 at Cape Canaveral Space Force Station in Florida on Tuesday, January 30, 2024, for the 20th Northrop Grumman commercial resupply mission for NASA. The spacecraft will bring 8,200 pounds of science investigations, supplies, and equipment to the International Space Station, including tests of a 3D metal printer, semiconductor manufacturing, and thermal protection systems. The Cygnus spacecraft is expected to reach the space station on Thursday, February 1, 2024, where it will remain until its expected departure in May. Credit: SpaceX

Northrop Grumman's Cygnus spacecraft is an unmanned cargo spacecraft designed to transport supplies, equipment, and scientific experiments to the International Space Station (ISS). Developed as part of NASA's Commercial Orbital Transportation Services (COTS) program, Cygnus plays a crucial role in maintaining the ISS's operations and advancing space research.

The spacecraft consists of two primary components: the Service Module, which contains the spacecraft's avionics, propulsion, and power systems, and the Pressurized Cargo Module, where the cargo is stored. Once Cygnus completes its mission and is unberthed from the ISS, it safely burns up upon re-entering the Earth's atmosphere. This design makes Cygnus an efficient means of not only delivering supplies but also disposing of the station's waste.

Over the years, Cygnus has been instrumental in numerous resupply missions, contributing significantly to the ongoing success and sustainability of the ISS and its missions.


SpaceX launches new resupply mission to space station – Xinhua

The image of Northrop Grumman's Cygnus resupply spacecraft is posted on the website of NASA. (Photo credit: NASA)


LOS ANGELES, Jan. 30 (Xinhua) -- A fresh supply of more than 8,200 pounds of scientific investigations and cargo was launched to the International Space Station (ISS) on Tuesday on a Cygnus resupply spacecraft of American aerospace technology company Northrop Grumman.

The Cygnus cargo spacecraft was launched on a SpaceX Falcon 9 rocket at 12:07 p.m. Tuesday Eastern Time from Cape Canaveral Space Force Station in the U.S. state of Florida.

It is Northrop Grumman's 20th contracted resupply mission for NASA.

The Cygnus spacecraft is scheduled to remain at the ISS until May, when it will depart the orbiting laboratory, at which point it will harmlessly burn up in Earth's atmosphere, according to NASA.


Cygnus Flies to International Space Station – Mirage News


In this image from Jan. 30, 2024, an uncrewed Cygnus cargo spacecraft launches atop a SpaceX Falcon 9 rocket, starting its journey to the International Space Station. Launching from NASA's Kennedy Space Center in Florida, Cygnus carries 8,200 pounds of science investigations and cargo to support dozens of research experiments. This is Northrop Grumman's 20th cargo flight to the orbiting laboratory.

Watch NASA+ for live coverage of Cygnus's approach to the space station on Feb. 1, 2024, beginning at 2:45 a.m. EST.

Image Credit: SpaceX


Surgical robot built in Lincoln blasts off to International Space Station – KLKN

LINCOLN, Neb. (KLKN) - A surgical robot built by a Lincoln company got a glimpse of the stars Tuesday.

The robot, along with other experiments, supplies and equipment, was launched into space from Cape Canaveral, Florida.

Virtual Incision, a startup company based at Nebraska Innovation Campus, created the miniaturized in vivo robotic assistant, also known as MIRA.

MIRA can perform abdominal surgeries in a minimally invasive manner, officials said.

Surgeons could also use the technology to perform procedures remotely.

NASA took an interest in the robot last year.

The robot is heading to the International Space Station, where it will help test remote surgery tasks.

Shane Farritor, Virtual Incision's co-founder, said the research will be a huge step toward what he calls telesurgery.

"We'll start by having the robot do a little bit by itself, but then later in the mission, we're going to try and control it from Lincoln," Farritor said.

MIRA will collect data for Farritor's team before it returns to Earth in the spring.


NASA, SpaceX, Northrop Grumman team up on International Space Station cargo resupply launch – Florida Today


Cape Canaveral launch: Supplies head to ISS – WESH 2 Orlando

NASA, SpaceX, more launch supplies to International Space Station from Cape Canaveral

Updated: 12:28 PM EST Jan 30, 2024

NASA, SpaceX and Northrop Grumman successfully completed a Tuesday afternoon launch from Cape Canaveral to the International Space Station.

Filled with more than 8,200 pounds of supplies, the Cygnus cargo spacecraft, carried on the SpaceX Falcon 9 rocket, launched from Space Launch Complex 40.

The supplies are expected to reach the ISS on Feb. 1.

According to officials, highlights of the space station research facilitated by this delivery include:

- The first surgical robot on the space station
- An orbit re-entry platform that collects thermal protection systems data
- A 3D cartilage cell culture that maintains healthy cartilage in lower gravity
- The MSTIC facility, an autonomous semiconductor manufacturing platform
- A metal 3D printer that will test the capability for printing small metal parts

The Cygnus spacecraft is scheduled to remain at the ISS until May, when it will depart and harmlessly burn up in the Earth's atmosphere.


Watch SpaceX launch Cygnus cargo ship to space station – Olean Times Herald

A SpaceX Falcon 9 rocket launched a cargo resupply mission Tuesday to the International Space Station from Cape Canaveral Space Force Station in Florida.



Lichen Survives on Outside of International Space Station – ExplorersWeb

To ask if you could live outside the International Space Station (ISS) is rhetorical at best. But could any living organism on Earth manage it?

One unassuming toughie did, providing at least a rough proof of concept that life could exist on Mars.

Lichen from Antarctica's McMurdo Dry Valleys survived 18 months on a platform attached to the outside of the ISS's Columbus module, Futurism reported. Though they emerged in worse shape than temperate lichens tested separately in Mars-like conditions, many still survived.

The International Space Station. Photo: NASA

The study authors focused on the success of the species in the Martian simulation.

"The most relevant outcome was that more than 60% of the cells remained intact after exposure to Mars," said Rosa de la Torre Noetzel of Spain's National Institute of Aerospace Technology (INTA) and co-researcher on the project.

Survival in outer space itself was lower. Only around 35% of these lichens' cells retained their membranes throughout the experiment.

Nevertheless, this is strong evidence that lichens are orders of magnitude tougher than most other living things.

For carbon-based life forms, outer space is, in a word, unsurvivable. It is, in no particular order, a near-total vacuum, subject to extreme swings of temperature, and bathed in intense ultraviolet and cosmic radiation.

However, repeated experiments have proven lichens' resistance to these conditions.

In 2005, researchers placed lichens aboard a rocket and then attached them to a European Space Agency module outside a Russian satellite. They left them for 16 days, then brought them back home.

"All exposed lichens, regardless of the optical filters used, showed nearly the same photosynthetic activity after the flight," the study said. "These findings indicate that [most lichen cells] can survive in space after full exposure to massive UV and cosmic radiation, conditions proven to be lethal to bacteria and other microorganisms."


Cloud-Computing in the Post-Serverless Era: Current Trends and Beyond – InfoQ.com

Key Takeaways

[Note: The opinions and predictions in this article are those of the author and not of InfoQ.]

As AWS Lambda approaches its 10th anniversary this year, serverless computing has expanded beyond just Function as a Service (FaaS). Today, serverless describes cloud services that require no manual provisioning, offer on-demand auto-scaling, and use consumption-based pricing. This shift is part of a broader evolution in cloud computing, with serverless technology continuing to transform how applications are built and operated. This article focuses on the future beyond serverless, exploring how the cloud landscape will evolve beyond current hyperscaler models and what that evolution means for developers and operations teams. I will examine the top three trends shaping this evolution.

In software development, a "module" or "component" typically refers to a self-contained unit of software that performs a cohesive set of actions. This concept corresponds elegantly to the microservice architecture that typically runs on long-running compute services such as Virtual Machines (VMs) or a container service. AWS EC2, one of the first widely accessible cloud computing services, offered scalable VMs. Introducing such scalable, accessible cloud resources provided the infrastructure necessary for microservices architecture to become practical and widespread. This shift led to decomposing monolithic applications into independently deployable microservice units.

Let's continue with this analogy of software units. A function is a block of code that encapsulates a sequence of statements performing a single task, with defined input and output. This unit of code corresponds nicely to the FaaS execution model. The concept of FaaS (executing code in response to events without the need to manage infrastructure) existed before AWS Lambda but lacked broad implementation and recognition.

The concept of FaaS, which involves executing code in response to events without the need for managing infrastructure, was already suggested by services like Google App Engine, Azure WebJobs, IronWorker, and AWS Elastic Beanstalk before AWS Lambda brought it into the mainstream. Lambda, emerging as the first major commercial implementation of FaaS, acted as a catalyst for its popularity by easing the deployment process for developers. This advancement led to the transformation of microservices into smaller, individually scalable, event-driven operations.
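The execution model Lambda popularized can be sketched in a few lines: a stateless handler that the platform invokes once per event, with nothing for the developer to provision or scale. The signature mirrors Lambda's `(event, context)` convention, but the thumbnail-resizing task and event fields below are illustrative assumptions, not any service's real API.

```python
# Minimal sketch of the FaaS execution model: a stateless handler invoked
# once per event. The platform, not the developer, provisions and scales
# the compute that runs it. Event fields here are illustrative.

def handler(event, context=None):
    """Compute a quarter-size thumbnail spec from an image-resize event."""
    width = event.get("width", 0)
    height = event.get("height", 0)
    return {"thumbnail": {"width": width // 4, "height": height // 4}}

# Locally we can mimic what the platform does on each incoming event.
print(handler({"width": 1920, "height": 1080}))
# -> {'thumbnail': {'width': 480, 'height': 270}}
```

The key property is that the unit of deployment is a single task with defined input and output, which is what lets the platform scale each invocation independently.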

In the evolution toward smaller software units offered as a service, one might wonder if we will see basic programming elements like expressions or statements as a service (such as int x = a + b;). The progression, however, steers away from this path. Instead, we are witnessing the minimization and eventual replacement of functions by configurable cloud constructs. Constructs in software development, encompassing elements like conditionals (if-else, switch statements), loops (for, while), exception handling (try-catch-finally), or user-defined data structures, are instrumental in controlling program flow or managing complex data types. In cloud services, constructs align with capabilities that enable the composition of distributed applications, interlinking software modules such as microservices and functions, and managing data flow between them.

Cloud construct replacing functions, replacing microservices, replacing monolithic applications

While you might previously have used a function to filter, route, batch, or split events, or to call another cloud service or function, these operations and more can now be done with less code in your functions, or in many cases with no function code at all. They can be replaced by configurable cloud constructs that are part of the cloud services. Let's look at a few concrete examples from AWS that demonstrate this transition from Lambda function code to cloud constructs.

These are just a few examples of application code constructs becoming serverless cloud constructs. Rather than validating input values in a function with if-else logic, you can validate the inputs through configuration. Rather than routing events with a case or switch statement to invoke other code from within a function, you can define routing logic declaratively outside the function. Events can be triggered from data sources on data change, batched, or split without a repetition construct, such as a for or while loop.
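To make the filtering example concrete, the same rule is written below both ways: first as if-else logic inside a function, then as a declarative pattern shaped like the filter rules services such as EventBridge or Lambda event source mappings accept. The `match_pattern` helper is a deliberately simplified local stand-in for service-side matching (the real filter syntax is richer), and the event fields are invented for illustration.

```python
# Before: filtering logic lives inside the function body as if-else code.
def handle_in_code(event):
    if event.get("source") == "orders" and event.get("status") == "created":
        return f"process order {event['id']}"
    return None  # everything else is discarded by hand-written checks

# After: the filter moves into configuration. The pattern below mimics the
# shape of a declarative filter rule; the function is then only ever
# invoked with events that matched.
PATTERN = {"source": ["orders"], "status": ["created"]}

def match_pattern(event, pattern):
    """Simplified local stand-in for service-side pattern matching."""
    return all(event.get(key) in allowed for key, allowed in pattern.items())

events = [
    {"source": "orders", "status": "created", "id": 1},
    {"source": "orders", "status": "shipped", "id": 2},
]
matches = [e for e in events if match_pattern(e, PATTERN)]
print([e["id"] for e in matches])  # -> [1]: only matching events reach the code
```

The payoff is that the second version leaves nothing in the function to patch or maintain: the rule lives in configuration, owned by the service.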

Events can be validated, transformed, batched, routed, filtered, and enriched without a function. Failures can be handled and directed to DLQs and back without try-catch code, and successful completions can be directed onward to other functions and service endpoints. Moving these constructs from application code into construct configuration shrinks or removes the application code, eliminating the need for security patching and other maintenance.
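The DLQ behavior can be sketched in the same spirit: instead of wrapping business logic in try-catch, a platform-style wrapper retries an invocation and parks exhausted events in a dead-letter queue. `invoke_with_dlq` is a local illustration of what such services provide through configuration, not a real cloud API.

```python
# Sketch of failure handling as configuration rather than try-catch in the
# handler: a platform-style wrapper retries, then parks the failing event
# in a dead-letter queue (DLQ). Local illustration only.

def fragile_handler(event):
    if event.get("payload") is None:
        raise ValueError("missing payload")
    return event["payload"].upper()

def invoke_with_dlq(handler, event, dlq, retries=2):
    """Retry the handler; after exhausting retries, divert the event to the DLQ."""
    for _ in range(retries + 1):
        try:
            return handler(event)
        except Exception:
            continue
    dlq.append(event)  # parked for later inspection or redelivery
    return None

dlq = []
ok = invoke_with_dlq(fragile_handler, {"payload": "hello"}, dlq)
bad = invoke_with_dlq(fragile_handler, {"payload": None}, dlq)
print(ok, len(dlq))  # -> HELLO 1
```

Note that `fragile_handler` itself contains no error-handling code; the retry count and the failure destination are knobs on the wrapper, which is exactly where the construct-based model puts them.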

A primitive and a construct in programming have distinct meanings and roles. A primitive is a basic data type inherently part of a programming language. It embodies a basic value, such as an integer, float, boolean, or character, and does not comprise other types. Mirroring this concept, the cloud - just like a giant programming runtime - is evolving from infrastructure primitives like network load balancers, virtual machines, file storage, and databases to more refined and configurable cloud constructs.

Like programming constructs, these cloud constructs orchestrate distributed application interactions and manage complex data flows. However, these constructs are not isolated cloud services; there isn't a standalone "filtering as a service" or "event emitter as a service." There is no "Constructs as a Service"; rather, constructs are increasingly essential features of core cloud primitives such as gateways, data stores, message brokers, and function runtimes.

This evolution reduces application code complexity and, in many cases, eliminates the need for custom functions. This shift from FaaS to NoFaaS (no fuss, implying simplicity) is just beginning, with insightful talks and code examples on GitHub. Next, I will explore the emergence of construct-rich cloud services within vertical multi-cloud services.

In the post-serverless cloud era, it's no longer enough to offer highly scalable cloud primitives like compute for containers and functions; storage services such as key/value stores, event stores, and relational databases; or networking primitives like load balancers. Post-serverless cloud services must be rich in developer constructs and offload much of the application plumbing. This goes beyond hyperscaling a generic cloud service for a broad user base; it involves deep specialization and exposing advanced constructs to more demanding users.

Hyperscalers like AWS, Azure, GCP, and others, with their vast range of services and extensive user bases, are well-positioned to identify new user needs and constructs. However, providing these more granular developer constructs results in increased complexity. Each new construct in every service requires a deep learning curve with its specifics for effective utilization. Thus, in the post-serverless era, we will observe the rise of vertical multi-cloud services that excel in one area. This shift represents a move toward hyperspecialization of cloud services.

Consider Confluent Cloud as an example. While all major hyperscalers (AWS, Azure, GCP, etc.) offer Kafka services, none match the developer experience and constructs provided by Confluent Cloud. With its Kafka brokers, numerous Kafka connectors, integrated schema registry, Flink processing, data governance, tracing, and message browser, Confluent Cloud delivers the most construct-rich and specialized Kafka service, surpassing what hyperscalers offer.

This trend is not isolated; examples abound: MongoDB Atlas versus DocumentDB, GitLab versus CodeCommit, Databricks versus EMR, Redis Labs versus ElastiCache, and so on. Beyond established cloud companies, a new wave of startups is emerging, each focusing on a single multi-cloud primitive (such as specialized compute, storage, networking, build pipelines, or monitoring) and enriching it with developer constructs to offer a unique value proposition. Here are some cloud services hyperspecializing in a single open-source technology, aiming to provide a construct-rich experience and attract users away from hyperscalers:

This list represents a fraction of a growing ecosystem of hyperspecialized vertical multi-cloud services built atop core cloud primitives offered by hyperscalers. They compete by providing a comprehensive set of programmable constructs and an enhanced developer experience.

Serverless cloud services hyperspecializing in one thing with rich developer constructs

Once this transition is completed, bare-bones cloud services without rich constructs, even serverless ones, will seem like outdated on-premise software. A storage service must stream changes like DynamoDB; a message broker should include EventBridge-like constructs for event-driven routing, filtering, and endpoint invocation with retries and DLQs; a pub/sub system should offer message batching, splitting, filtering, transforming, and enriching.
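As a rough illustration of the event-driven routing and filtering constructs described above, here is a minimal Python sketch of EventBridge-style content filtering. The `matches` helper and its exact-match-list semantics are simplifications for illustration, not the actual service API:

```python
def matches(pattern: dict, event: dict) -> bool:
    """Return True if the event satisfies the pattern.

    Pattern semantics mimic basic EventBridge rules: each leaf is a list
    of accepted literal values; nested dicts match nested event fields.
    """
    for key, expected in pattern.items():
        value = event.get(key)
        if isinstance(expected, dict):
            if not isinstance(value, dict) or not matches(expected, value):
                return False
        elif value not in expected:
            return False
    return True

pattern = {"source": ["orders"], "detail": {"status": ["created", "paid"]}}
print(matches(pattern, {"source": "orders", "detail": {"status": "paid"}}))   # True
print(matches(pattern, {"source": "billing", "detail": {"status": "paid"}}))  # False
```

A broker exposing this construct lets the filtering logic move out of the function body and into configuration, which is exactly the plumbing offload described above.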

Ultimately, while hyperscalers expand horizontally with an increasing array of services, hyperspecializers grow vertically, offering a single, best-in-class service enriched with constructs, forming an ecosystem of vertical multi-cloud services. The future of cloud service competition will pivot from infrastructure primitives to a duo of core cloud primitives and developer-centric constructs.

Cloud constructs increasingly blur the boundaries between application and infrastructure responsibilities. The next evolution is the "shift left" of cloud automation, integrating application and automation code in terms of both tools and responsibilities. Let's examine how this transition is unfolding.

The first generation of cloud infrastructure management was defined by Infrastructure as Code (IaC), a pattern that emerged to simplify the provisioning and management of infrastructure. This approach built on the commoditization of virtualization in cloud computing.

The initial IaC tools introduced new domain-specific languages (DSLs) dedicated to creating, configuring, and managing cloud resources in a repeatable manner. Tools like Chef, Ansible, Puppet, and Terraform led this phase. Leveraging declarative languages, they allowed operations teams to define the infrastructure's desired state in code, abstracting away underlying complexities.

However, as the cloud landscape transitions from low-level coarse-grained infrastructure to more developer-centric programmable finer-grained constructs, a trend toward using existing general-purpose programming languages for defining these constructs is emerging. New entrants like Pulumi and the AWS Cloud Development Kit (CDK) are at the forefront of this wave, supporting languages such as TypeScript, Python, C#, Go, and Java.

The shift to general-purpose languages is driven by the need to overcome the limitations of declarative languages, which lack the expressiveness and flexibility to define cloud constructs programmatically, and by the shift-left of cloud-construct configuration from operations to developers. Unlike the static nature of declarative languages, which suits low-level static infrastructure, general-purpose languages let developers define dynamic, logic-driven cloud constructs, achieving closer alignment with application code.
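To make the difference concrete, here is a hedged Python sketch of what "dynamic, logic-driven" resource definitions look like. The `bucket_specs` helper and its output format are hypothetical, standing in for the resource declarations a tool like Pulumi or the CDK would accept:

```python
def bucket_specs(envs):
    """Sketch of logic-driven resource definitions: loops and conditionals
    replace the copy-pasted blocks a static declarative file would need."""
    specs = {}
    for env in envs:
        specs[f"logs-{env}"] = {
            "type": "storage-bucket",
            "versioning": env == "prod",           # stricter settings only in prod
            "retention_days": 365 if env == "prod" else 30,
        }
    return specs

print(bucket_specs(["dev", "prod"]))
```

In a declarative DSL, the per-environment variation would typically be expressed through duplication or templating; in a general-purpose language it is ordinary control flow.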

Shifting-left of application composition from infrastructure to developer teams

Post-serverless cloud developers need not only to implement business logic by creating functions and microservices but also to compose them together using programmable cloud constructs. This shapes a broader set of developer responsibilities for developing and composing cloud applications. For example, code with business logic in a Lambda function would also need routing, filtering, and request transformation configurations in API Gateway.

Another Lambda function may need a DynamoDB stream configuration to capture specific data changes, along with EventBridge routing, filtering, and enrichment configurations.

A third application may have most of its orchestration logic expressed as a Step Functions workflow in which the Lambda code is only a small task. A developer, not a platform engineer or Ops member, can compose these units of code together. Tools such as Pulumi, the AWS CDK, and others that let a developer implement a function in the language of their choice and use that same language to compose its interaction with the cloud environment are best suited for this era.
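A minimal sketch of this co-location idea in plain Python: the `route` decorator below is a hypothetical helper, not a real Pulumi or CDK API, but it shows how gateway routing metadata can live next to the handler it configures, in the same language:

```python
ROUTES = []

def route(path, method="GET"):
    """Register gateway routing metadata next to the handler it belongs to,
    echoing how CDK/Pulumi co-locate composition code with business logic."""
    def register(fn):
        ROUTES.append({"path": path, "method": method, "handler": fn.__name__})
        return fn
    return register

@route("/orders", method="POST")
def create_order(event):
    # Business logic lives here; the routing construct sits right above it.
    return {"status": 201}

print(ROUTES[0])  # {'path': '/orders', 'method': 'POST', 'handler': 'create_order'}
```

The developer who writes `create_order` also owns its routing, with no context switch to a separate DSL or operations handoff.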

Platform teams can still use declarative languages such as Terraform to govern, secure, monitor, and enable teams in cloud environments, but developer-focused constructs, combined with developer-focused cloud automation languages, will shift cloud constructs left and make developer self-service in the cloud a reality.

The transition from DSL to general-purpose languages marks a significant milestone in the evolution of IaC. It acknowledges the transition of application code into cloud constructs, which often require a deeper developer control of the resources for application needs. This shift represents a maturation of IaC tools, which now need to cater to a broader spectrum of infrastructure orchestration needs, paving the way for more sophisticated, higher-level abstractions and tools.

The journey of infrastructure management is shifting from static configurations to a more dynamic, code-driven approach. This evolution hasn't stopped at Infrastructure as Code; it is evolving into a more nuanced realm known as Composition as Code. This paradigm further blurs the lines between application code and infrastructure, leading to more streamlined, efficient, and developer-friendly practices.

Summarizing these trends and their reinforcing effects, we're observing an increasing integration of programming constructs into cloud services. Every compute service will integrate CI/CD pipelines; databases will provide HTTP access from the edge and emit change events; message brokers will enhance their capabilities with filtering, routing, idempotency, transformations, DLQs, and more.
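Idempotency, one of the broker capabilities listed above, can be sketched in a few lines of Python. The in-memory `processed_ids` set is a stand-in for the durable deduplication store a real broker or consumer would use:

```python
processed_ids = set()  # in production this would be a durable store

def handle_once(message_id, payload):
    """Idempotent consumer: redeliveries of a processed message are dropped."""
    if message_id in processed_ids:
        return False              # duplicate - suppress side effects
    processed_ids.add(message_id)
    # ... business logic would run here ...
    return True

print(handle_once("m-1", {"total": 10}))  # True  (first delivery)
print(handle_once("m-1", {"total": 10}))  # False (redelivery suppressed)
```

When the broker offers this as a built-in construct, application code no longer needs to carry the deduplication bookkeeping itself.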

Infrastructure services are evolving into serverless APIs, infrastructure inferred from code (IfC), framework-defined infrastructure, or explicitly composed by developers (CaC). This evolution leads to smaller functions and sometimes to NoFaaS pattern, paving the way for hyperspecialized, developer-first vertical multi-cloud services. These services will offer infrastructure as programmable APIs, enabling developers to seamlessly merge their applications using their preferred programming language.

The shift-left of application composition using cloud services will increasingly blend with application programming, transforming microservices from an architectural style to an organizational one. A microservice will no longer be just a single deployment unit or process boundary but a composition of functions, containers, and cloud constructs, all implemented and glued together in a single language chosen by the developer. The future is shaping up to be hyperspecialized and focused on the developer-first cloud.

Follow this link:

Cloud-Computing in the Post-Serverless Era: Current Trends and Beyond - InfoQ.com

Cloud Computing Security Start with a ‘North Star’ – ITPro Today

Cloud computing has followed a similar journey to other introductions of popular technology: Adopt first, secure later. Cloud transformation has largely been enabled by IT functions at the request of the business, with security functions often taking a backseat. In some organizations, this has been due to politics and blind faith in the cloud services providers (CSPs), e.g., AWS, Microsoft, and GCP.

In others, it has been because security functions only knew and understood on-premises deployments and simply didn't have the knowledge and capability to securely adapt to cloud or hybrid architectures and translate policies and processes to the cloud. For lucky organizations, this has only led to stalled migrations while the security and IT organizations played catch up. For unlucky organizations, this has led to breaches, business disruption, and loss of data.

Related: What Is Cloud Security?

Cloud security can be complex. However, more often than not, it is ridiculously simple, the misconfigured S3 bucket being a prime example. It reached a point where malefactors could simply look for misconfigured S3 buckets to steal data; there was no need to launch an actual attack.
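The misconfiguration in question is easy to detect mechanically. This hedged Python sketch flags the classic pattern of a bucket policy granting `s3:GetObject` to everyone; `allows_public_read` is an illustrative helper, not an AWS API, and real tooling such as CSPM products covers many more cases:

```python
import json

def allows_public_read(policy_json):
    """Flag a bucket policy that grants s3:GetObject to everyone - the
    classic open-S3-bucket misconfiguration."""
    policy = json.loads(policy_json)
    for stmt in policy.get("Statement", []):
        actions = stmt.get("Action", [])
        if isinstance(actions, str):
            actions = [actions]
        if (stmt.get("Effect") == "Allow"
                and stmt.get("Principal") in ("*", {"AWS": "*"})
                and "s3:GetObject" in actions):
            return True
    return False

open_policy = json.dumps({"Statement": [
    {"Effect": "Allow", "Principal": "*", "Action": "s3:GetObject",
     "Resource": "arn:aws:s3:::example-bucket/*"}]})
print(allows_public_read(open_policy))  # True
```

That a dozen lines of code can spot the flaw underlines the point: these breaches stem from simple, checkable mistakes, not sophisticated attacks.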

It's time for organizations to take a step back and improve cloud security, and the best way to do this is to put security at the core of cloud transformations rather than adopting the technology first and asking security questions later. Here are four steps to course-correct and implement a security-centric cloud strategy:

Related: Cloud Computing Predictions 2024: What to Expect From FinOps, AI

For multi-cloud users, there is one other aspect of cloud security to consider. Most CSPs are separate businesses, and their services don't work with other CSPs. Rather than functioning like internet service providers (ISPs), where one provider lets you access the entire internet and not just the sites the ISP owns, CSPs operate in silos, with limited interoperability with their counterparts (e.g., AWS can't manage Azure workloads, security, and services, and vice versa). This is problematic for customers because, once more than one cloud provider is added to the infrastructure, the efficacy of managing cloud operations and cloud security starts to diminish rapidly. Each time another CSP is added to an organization's environment, its attack surface grows exponentially unless secured appropriately.

It's up to each company to take steps to become more secure in multi-cloud environments. In addition to developing and executing a strong security strategy, they also must consider using third-party applications and platforms such as cloud-native application protection platforms (CNAPPs), cloud security posture management (CSPM), infrastructure as code (IaC), and secrets management to provide the connective tissue between CSPs in hybrid or multi-cloud environments. Taking this vital step will increase security visibility, posture management, and operational efficiency to ensure the security and business results outlined at the start of the cloud security journey.

It should be noted that a cloud security strategy, like any other form of security, needs to be a "living" plan. The threat landscape and business needs change so fast that what is helpful today may not be helpful tomorrow. To stay in step with your organization's desired state of security, periodically revisit cloud security strategies to understand whether they are delivering the desired benefits, and make adjustments when they are not.

Cloud computing has transformed organizations of all types. Adopting a strategy for securing this new environment will not only allow security to catch up to technology adoption, it will also dramatically improve the ROI of cloud computing.

Ed Lewis is Secure Cloud Transformation Leader at Optiv.

Read this article:

Cloud Computing Security Start with a 'North Star' - ITPro Today

Global $83.7 Bn Cloud Computing Management and Optimization Market to 2030 with IT and Telecommunications … – PR Newswire

DUBLIN, Jan. 23, 2024 /PRNewswire/ -- The "Global Cloud Computing Management and Optimization Market 2023 - 2030 by Types, Applications - Partner & Customer Ecosystem Competitive Index & Regional Footprints" report has been added to ResearchAndMarkets.com's offering.

The Cloud Computing Management and Optimization Market size is estimated to grow from USD 17.6 Billion in 2022 to reach USD 83.7 Billion by 2030, growing at a CAGR of 21.7% during the forecast period from 2023 to 2030.
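The reported figures are internally consistent, as a quick back-of-the-envelope check shows (2022 to 2030 spans eight compounding periods):

```python
# Implied compound annual growth rate from the report's own numbers.
start, end, years = 17.6, 83.7, 8   # USD billions, 2022 -> 2030
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # about 21.5%, in line with the reported 21.7% CAGR
```

Small rounding in the endpoint figures accounts for the gap between the computed and reported rates.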

The Adoption of Cloud-Based Solutions Is Driving Cloud Computing Management and Optimization Market Growth

Businesses are migrating their operations to cloud-based ecosystems because they offer a number of benefits, such as scalability, flexibility, and cost savings. A growing number of companies, including SMEs and large-scale enterprises, are adopting cloud computing, which will lead to an increase in demand for cloud computing management and optimization solutions.

Cloud computing environments are becoming increasingly complex as businesses adopt a variety of cloud services from different providers. This complexity can make it difficult for businesses to manage their cloud costs and performance. Cloud computing management and optimization solutions can help businesses simplify their cloud environments and optimize their costs and performance. Cloud computing can be a cost-effective way for businesses to access IT resources.

However, businesses can still incur significant costs if they do not manage their cloud usage effectively. Cloud computing management and optimization solutions can help businesses to track their cloud usage and identify opportunities to optimize their costs. The cloud computing industry is constantly evolving, with the emergence of new technologies, such as artificial intelligence and machine learning. These new technologies can be used to improve the efficiency and effectiveness of cloud computing management and optimization solutions.

The IT and Telecommunications industries hold the highest market share in the Cloud Computing Management and Optimization Market

The IT and Telecommunications industries hold the highest market share in the Cloud Computing Management and Optimization Market in 2022, due to their intrinsic reliance on advanced technology solutions and their pivotal role in driving digital transformation across various sectors. In the IT industry, cloud computing has become a cornerstone for delivering software, platforms, and infrastructure services, enabling organizations to enhance agility, scalability, and operational efficiency.

As IT companies transition their operations to the cloud, the need for effective management and optimization of cloud resources becomes paramount to ensure optimal performance, cost control, and resource allocation. Cloud management and optimization solutions enable IT enterprises to streamline provisioning, monitor workloads, automate processes, and maintain stringent security protocols.

Furthermore, the Telecommunications sector has embraced cloud computing to modernize and expand its network infrastructure, offer innovative communication services, and adapt to the demands of an interconnected world. Cloud-based solutions empower telecom companies to efficiently manage network resources, deliver seamless customer experiences, and explore new revenue streams.

In this context, cloud computing management and optimization are essential for maintaining network reliability, ensuring data privacy, and dynamically scaling resources to meet fluctuating demand. The complex and dynamic nature of both IT and Telecommunications operations necessitates sophisticated tools and strategies for cloud resource management, making these industries prime contributors to the Cloud Computing Management and Optimization Market.

Regional Insight: North America dominated the Cloud Computing Management and Optimization Market during the forecast period.

North America dominated the Cloud Computing Management and Optimization Market during the forecast period. The United States and Canada, at the forefront of technological development, have continuously adopted cloud computing, strengthening North America's remarkable position as market leader. The strong presence of major companies like Adobe, Salesforce, Oracle, AWS, Google, and IBM across the region's wide geography provides a foundation for this rise. With their cutting-edge solutions, these major players make a significant impact on adoption and innovation.

The region's commitment to technical advancement also serves as another indication of its dominance. Continuous improvements in a number of technologies are transforming the cloud computing industry, and North America is recognized as a hub for important developments.

As a result, organizations and enterprises in North America are pushed to the forefront of cloud optimization and administration, utilizing the full range of technologies and expertise provided by both local and international industry experts. Strong vendor presence, widespread acceptance, and constant technological innovation put North America in the lead to capture the highest market share during the forecast period.

Major Classifications are as follows:

Cloud Computing Management and Optimization Market, Type of Solutions

Cloud Computing Management and Optimization Market, By Deployment Models

Cloud Computing Management and Optimization Market, By Organization Size

Cloud Computing Management and Optimization Market, By Cloud Service Models

Cloud Computing Management and Optimization Market, By Technologies

Cloud Computing Management and Optimization Market, By Industries

Cloud Computing Management and Optimization Market, By Geography

Companies Mentioned

For more information about this report visit https://www.researchandmarkets.com/r/bx3846

About ResearchAndMarkets.com

ResearchAndMarkets.com is the world's leading source for international market research reports and market data. We provide you with the latest data on international and regional markets, key industries, the top companies, new products and the latest trends.

Media Contact:

Research and Markets
Laura Wood, Senior Manager
[emailprotected]
For E.S.T Office Hours Call +1-917-300-0470
For U.S./CAN Toll Free Call +1-800-526-8630
For GMT Office Hours Call +353-1-416-8900
U.S. Fax: 646-607-1907
Fax (outside U.S.): +353-1-481-1716

Logo: https://mma.prnewswire.com/media/539438/Research_and_Markets_Logo.jpg

SOURCE Research and Markets

Read this article:

Global $83.7 Bn Cloud Computing Management and Optimization Market to 2030 with IT and Telecommunications ... - PR Newswire

The Future of Cloud Computing in Business Operations – Data Science Central

The digital era has witnessed the remarkable evolution of cloud computing, transforming it into a cornerstone of modern business operations. This technology, which began as a simple concept of centralized data storage, has now evolved into a complex and dynamic ecosystem, enabling businesses to operate more efficiently and effectively than ever before. The Future of Cloud Computing holds unparalleled potential, promising to revolutionize the way companies operate, innovate, and compete in the global market.

Cloud computing refers to the delivery of various services over the Internet, including data storage, servers, databases, networking, and software. Rather than owning their computing infrastructure or data centers, companies can rent access to anything from applications to storage from a cloud service provider.

Cloud computing has revolutionized the way businesses operate, offering a plethora of advantages that enhance efficiency, flexibility, and scalability. In this discussion, we'll delve into the key benefits of cloud computing, explaining each in simple terms and underlining their significance in today's business landscape.

Cloud computing significantly cuts down on the capital cost associated with purchasing hardware and software, especially in sectors like healthcare. It's an economical alternative to owning and maintaining extensive IT infrastructure, allowing businesses, including those in the healthcare sector, to save on setup and maintenance costs. This aspect is particularly beneficial for cloud computing in the healthcare industry, where resources can instead be allocated toward patient care and medical research.

The ability to scale resources elastically with cloud computing is akin to having a flexible and adaptable IT infrastructure. Businesses can efficiently scale up or down their IT resources based on current demand, ensuring optimal utilization and avoiding wastage.

Cloud services are hosted on a network of secure, high-performance data centers globally, offering superior performance over traditional single corporate data centers. This global network ensures reduced latency, better application performance, and economies of scale.

Cloud computing facilitates a swift and agile business environment. Companies can quickly roll out new applications or resources, empowering them to respond swiftly to market changes and opportunities.

The efficiency and speed offered by cloud computing translate into enhanced productivity. Reduced network latency ensures applications and services run smoothly, enabling teams to achieve more in less time.

Cloud computing enhances collaboration by enabling team members to share and work on data and files simultaneously from any location. This virtual collaboration space is crucial for businesses with remote teams and global operations.

Here, we explore the transformative role of cloud computing in business, focusing on 7 key points that forecast its future impact and potential in streamlining and innovating operational landscapes.

In the Future of Cloud Computing, handling enormous amounts of data will become more critical than ever. Businesses of all sizes generate data at unprecedented rates. From customer interactions to transaction records, every piece of data is a potential goldmine of insights. Cloud computing steps in as the ideal solution to manage this surge efficiently.

Cloud storage provides a scalable and flexible way to store and access vast datasets. As we move forward, cloud providers will likely offer more tailored storage solutions catering to different business needs. Whether it's for high-frequency access or long-term archiving, cloud storage can adapt to various requirements.

Another significant aspect of data management in the Future of Cloud Computing is real-time data processing. Businesses will rely on cloud computing not just for storage, but also for the immediate processing and analysis of data. This capability allows for quicker decision-making, a crucial factor in maintaining a competitive edge.

One of the most transformative impacts of cloud computing is its ability to transcend geographical boundaries. In the Future of Cloud Computing, remote and global teams can collaborate as if they were in the same room. Cloud-based tools and platforms allow team members from different parts of the world to work on projects simultaneously, share files instantaneously, and communicate in real-time.

In the Future of Cloud Computing, we can expect a rise in virtual workspaces. These digital environments simulate physical offices, providing a space where remote workers can feel connected and engaged. They offer features like virtual meeting rooms, shared digital whiteboards, and social areas, replicating the office experience in a digital realm.

Cloud computing does more than just streamline operations; it also opens doors to innovation. With cloud resources, businesses can experiment with new ideas without significant upfront investment in infrastructure. This flexibility encourages creativity and risk-taking, which are essential for innovation.

Cloud computing accelerates the product development cycle. Teams can quickly set up and dismantle test environments, prototype more efficiently, and bring products to market faster. This agility gives businesses a significant advantage in rapidly evolving markets.

The landscape of cloud computing is rapidly evolving, with new trends constantly emerging to redefine how businesses leverage this technology. In the context of the future of cloud computing, 3 key trends stand out for their potential to significantly shape the industry. Understanding these trends is crucial for businesses looking to stay competitive and innovative.

Artificial Intelligence (AI) and Machine Learning (ML) are becoming increasingly integral to cloud computing. This integration is revolutionizing how cloud services are delivered and utilized. AI algorithms are enhancing the efficiency of cloud platforms, offering smarter data analytics, automating routine tasks, and providing more personalized user experiences. For instance, cloud-based AI services can analyze vast amounts of data to predict market trends, customer behavior, or potential system failures, offering invaluable insights for businesses.

This integration not only boosts the performance and scalability of cloud solutions but also opens up new avenues for innovation across various sectors.

As cloud computing becomes more prevalent, the focus on security and compliance is intensifying. The increasing frequency and sophistication of cyber threats make robust cloud security a top priority for businesses. In response, cloud service providers are investing heavily in advanced security measures, such as enhanced encryption techniques, identity and access management (IAM), and AI-powered threat detection systems.

Furthermore, with regulations like GDPR and CCPA in place, compliance has become a critical aspect of cloud services. The future of cloud computing will likely witness a surge in cloud solutions that are not only secure but also compliant with various global and industry-specific regulations. This trend ensures that businesses can confidently and safely leverage the cloud while adhering to legal and ethical standards.

Sustainability is a growing concern in the tech world, and cloud computing is no exception. There is an increasing trend towards green cloud computing, focusing on reducing the environmental impact of cloud services. This involves optimizing data centers for energy efficiency, using renewable energy sources, and implementing more sustainable operational practices.

The future of cloud computing will likely see a stronger emphasis on sustainability as businesses and consumers become more environmentally conscious. Cloud providers who prioritize and implement eco-friendly practices will not only contribute to a healthier planet but also appeal to a growing segment of environmentally aware customers.

The future of cloud computing is bright and offers a plethora of opportunities for businesses to grow and evolve. By staying informed and adapting to these changes, companies can leverage cloud computing to gain a competitive edge in the market.

Remember, the future of cloud computing isn't just about technology; it's about how businesses can harness this technology to drive innovation, efficiency, and growth.

For businesses aiming to thrive in the ever-changing digital world, embracing the advancements in cloud computing is not just a choice but a necessity. Staying updated and adaptable will be key to harnessing the power of cloud computing for business success in the years to come.

Originally posted here:

The Future of Cloud Computing in Business Operations - Data Science Central

AWS to invest $15bn in cloud computing in Japan – DatacenterDynamics

Amazon Web Services (AWS) is planning to invest 2.26 trillion yen ($15.24 billion) in expanding its cloud computing infrastructure in Japan by 2027.

As part of this investment, the company will seek to expand its data center facilities in Tokyo and Osaka.

The cloud giant previously invested 1.51 trillion yen (~$10.2bn) in the country between 2011 and 2022, which works out to just under $1bn per year. The new announcement will see this increase to more than $5bn a year for the next three years.
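The per-year figures can be sanity-checked with simple arithmetic (treating 2011-2022 as twelve years, counting both endpoints):

```python
past_total = 10.2                   # ~USD billions invested 2011-2022
past_years = 2022 - 2011 + 1        # 12 years, counting both endpoints
new_total, new_years = 15.24, 3     # USD billions through 2027

print(round(past_total / past_years, 2))  # 0.85 -> "just under $1bn per year"
print(round(new_total / new_years, 2))    # 5.08 -> "more than $5bn a year"
```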

"The adoption of digital technology has become a source of a countrys competitiveness, said Takuya Hirai, former digital minister and current chair of headquarters for the promotion of a digital society in Japans Liberal Democratic Party.

"The development of digital infrastructure in Japan is key to strengthening the country's industrial competitiveness, and data centers play an important role to this end. It promotes the use of important technologies such as AI [artificial intelligence] and improves the capabilities of research and development in Japan."

The digital infrastructure in the country is also the backbone of AWS' artificial intelligence solutions. AWS provides generative AI services to Japanese customers including Asahi Group, Marubeni, and Nomura Holdings.

AWS first entered Japan in 2009. The company launched its first cloud region in the country in 2011 in Tokyo, and another in Osaka in 2021.

Amazon's Bedrock AI offering was made available in Tokyo in October 2023. The company also invested $100m in a generative AI innovation center in June 2023.

It is currently estimated that the latest investment will contribute 5.57 trillion yen (~$37.6bn) to Japan's GDP and support an average of 30,500 full-time jobs in Japanese businesses each year.

Japan's government is seeking to catch up in AI development. Prime Minister Fumio Kishida has met with the heads of OpenAI and Nvidia in the past year to discuss AI regulation and infrastructure.

In December 2023, Minister Ken Saito announced the government would double down on its pledge to support the domestic chip manufacturing industry.

Follow this link:

AWS to invest $15bn in cloud computing in Japan - DatacenterDynamics

Why is Application Mapping Important in Cloud Computing? – Techopedia


Original post:

Why is Application Mapping Important in Cloud Computing? - Techopedia