Americans’ use of ChatGPT is ticking up, but few trust its election information – Pew Research Center

It's been more than a year since ChatGPT's public debut set the tech world abuzz. And Americans' use of the chatbot is ticking up: 23% of U.S. adults say they have ever used it, according to a Pew Research Center survey conducted in February, up from 18% in July 2023.

The February survey also asked Americans about several ways they might use ChatGPT, including for workplace tasks, for learning and for fun. While growing shares of Americans are using the chatbot for these purposes, the public is more wary than not of what the chatbot might tell them about the 2024 U.S. presidential election. About four-in-ten adults have not too much or no trust in the election information that comes from ChatGPT. By comparison, just 2% have a great deal or quite a bit of trust.

Pew Research Center conducted this study to understand Americans' use of ChatGPT and their attitudes about the chatbot. For this analysis, we surveyed 10,133 U.S. adults from Feb. 7 to Feb. 11, 2024.

Everyone who took part in the survey is a member of the Center's American Trends Panel (ATP), an online survey panel that is recruited through national, random sampling of residential addresses. This way, nearly all U.S. adults have a chance of selection. The survey is weighted to be representative of the U.S. adult population by gender, race, ethnicity, partisan affiliation, education and other categories. Read more about the ATP's methodology.

Here are the questions used for this analysis, along with responses, and the survey methodology.

Below, we'll look more closely at who has used ChatGPT, what they use it for, and how much Americans trust the election information it provides.

Most Americans still haven't used the chatbot, despite the uptick since our July 2023 survey on this topic. But some groups remain far more likely to have used it than others.

Differences by age

Adults under 30 stand out: 43% of these young adults have used ChatGPT, up 10 percentage points since last summer. Use of the chatbot is also up slightly among those ages 30 to 49 and 50 to 64. Still, these groups remain less likely than their younger peers to have used the technology. Just 6% of Americans 65 and up have used ChatGPT.

Differences by education

Highly educated adults are most likely to have used ChatGPT: 37% of those with a postgraduate or other advanced degree have done so, up 8 points since July 2023. This group is more likely to have used ChatGPT than those with a bachelor's degree only (29%), some college experience (23%) or a high school diploma or less (12%).

Since March 2023, we've also tracked three potential reasons Americans might use ChatGPT: for work, to learn something new or for entertainment.

The share of employed Americans who have used ChatGPT on the job increased from 8% in March 2023 to 20% in February 2024, including an 8-point increase since July.

Turning to U.S. adults overall, about one-in-five have used ChatGPT to learn something new (17%) or for entertainment (17%). These shares have increased from about one-in-ten in March 2023.

Differences by age

Use of ChatGPT for work, learning or entertainment has largely risen across age groups over the past year. Still, there are striking differences between these groups (those 18 to 29, 30 to 49, and 50 and older).

For example, about three-in-ten employed adults under 30 (31%) say they have used it for tasks at work, up 19 points from a year ago, with much of that increase happening since July. These younger workers are more likely than their older peers to have used ChatGPT in this way.

Adults under 30 also stand out in using the chatbot for learning. And when it comes to entertainment, those under 50 are more likely than older adults to use ChatGPT for this purpose.

Differences by education

A third of employed Americans with a postgraduate degree have used ChatGPT for work, compared with smaller shares of workers who have a bachelor's degree only (25%), some college (19%) or a high school diploma or less (8%).

Those shares have each roughly tripled since March 2023 for workers with a postgraduate degree, bachelors degree or some college. Among workers with a high school diploma or less, use is statistically unchanged from a year ago.

Using ChatGPT for other purposes also varies by education level, though the patterns are slightly different. For example, a quarter each of postgraduate and bachelor's degree holders have used ChatGPT for learning, compared with 16% of those with some college experience and 11% of those with a high school diploma or less education. Each of these shares is up from a year ago.

With more people using ChatGPT, we also wanted to understand whether Americans trust the information they get from it, particularly in the context of U.S. politics.

About four-in-ten Americans (38%) don't trust the information that comes from ChatGPT about the 2024 U.S. presidential election; that is, they say they have not too much trust (18%) or no trust at all (20%).

A mere 2% have a great deal or quite a bit of trust, while 10% have some trust.

Another 15% aren't sure, while 34% have not heard of ChatGPT.

Distrust far outweighs trust regardless of political party. About four-in-ten Republicans and Democrats alike (including those who lean toward each party) have not too much or no trust at all in ChatGPT's election information.

Notably, however, very few Americans have actually used the chatbot to find information about the presidential election: Just 2% of adults say they have done so, including 2% of Democrats and Democratic-leaning independents and 1% of Republicans and GOP leaners.

These survey findings come amid growing national attention on chatbots and misinformation. Several tech companies have recently pledged to prevent the misuse of artificial intelligence, including chatbots, in this year's election. But recent reports suggest chatbots themselves may provide misleading answers to election-related questions.

Note: Here are the questions used for this analysis, along with responses, and the survey methodology.


Cloud-Computing in the Post-Serverless Era: Current Trends and Beyond – InfoQ.com

Key Takeaways

[Note: The opinions and predictions in this article are those of the author and not of InfoQ.]

As AWS Lambda approaches its 10th anniversary this year, serverless computing has expanded well beyond Function as a Service (FaaS). Today, serverless describes cloud services that require no manual provisioning, offer on-demand auto-scaling, and use consumption-based pricing. This shift is part of a broader evolution in cloud computing, with serverless technology continuing to transform how applications are built and run. This article focuses on the future beyond serverless, exploring how the cloud landscape will evolve beyond current hyperscaler models and what that evolution means for developers and operations teams. I will examine the top three trends shaping this evolution.

In software development, a "module" or "component" typically refers to a self-contained unit of software that performs a cohesive set of actions. This concept corresponds elegantly to the microservice architecture that typically runs on long-running compute services such as Virtual Machines (VMs) or a container service. AWS EC2, one of the first widely accessible cloud computing services, offered scalable VMs. Introducing such scalable, accessible cloud resources provided the infrastructure necessary for microservices architecture to become practical and widespread. This shift led to decomposing monolithic applications into independently deployable microservice units.

Let's continue with this analogy of software units. A function is a block of code that encapsulates a sequence of statements performing a single task with defined input and output. This unit of code corresponds nicely to the FaaS execution model. The concept of FaaS, executing code in response to events without the need to manage infrastructure, existed before AWS Lambda but lacked broad implementation and recognition.

The concept of FaaS, which involves executing code in response to events without the need for managing infrastructure, was already suggested by services like Google App Engine, Azure WebJobs, IronWorker, and AWS Elastic Beanstalk before AWS Lambda brought it into the mainstream. Lambda, emerging as the first major commercial implementation of FaaS, acted as a catalyst for its popularity by easing the deployment process for developers. This advancement led to the transformation of microservices into smaller, individually scalable, event-driven operations.

In the evolution toward smaller software units offered as a service, one might wonder if we will see basic programming elements like expressions or statements as a service (such as int x = a + b;). The progression, however, steers away from this path. Instead, we are witnessing the minimization and eventual replacement of functions by configurable cloud constructs. Constructs in software development, encompassing elements like conditionals (if-else, switch statements), loops (for, while), exception handling (try-catch-finally), or user-defined data structures, are instrumental in controlling program flow or managing complex data types. In cloud services, constructs align with capabilities that enable the composition of distributed applications, interlinking software modules such as microservices and functions, and managing data flow between them.

Cloud construct replacing functions, replacing microservices, replacing monolithic applications

While you might have previously used a function to filter, route, batch, split events, or call another cloud service or function, now these operations and more can be done with less code in your functions, or in many cases with no function code at all. They can be replaced by configurable cloud constructs that are part of the cloud services. Let's look at a few concrete examples from AWS to demonstrate this transition from Lambda function code to cloud constructs:

These are just a few examples of application code constructs becoming serverless cloud constructs. Rather than validating input values in a function with if-else logic, you can validate the inputs through configuration. Rather than routing events with a case or switch statement to invoke other code from within a function, you can define routing logic declaratively outside the function. Events can be triggered from data sources on data change, batched, or split without a repetition construct, such as a for or while loop.
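To make the first of these concrete, here is a minimal sketch in TypeScript using the AWS CDK, showing request validation declared as API Gateway configuration rather than as if-else logic inside a function. It assumes an existing stack and a backend Lambda handler; the resource names and the schema are illustrative assumptions, not taken from the article.

import { Stack } from 'aws-cdk-lib';
import * as apigateway from 'aws-cdk-lib/aws-apigateway';
import * as lambda from 'aws-cdk-lib/aws-lambda';

export function addValidatedOrderApi(stack: Stack, handler: lambda.IFunction) {
  const api = new apigateway.RestApi(stack, 'OrdersApi');

  // The input contract is declared as configuration; API Gateway rejects
  // non-conforming requests before any function code runs.
  const orderModel = api.addModel('OrderModel', {
    contentType: 'application/json',
    schema: {
      schema: apigateway.JsonSchemaVersion.DRAFT4,
      type: apigateway.JsonSchemaType.OBJECT,
      required: ['orderId', 'amount'],
      properties: {
        orderId: { type: apigateway.JsonSchemaType.STRING },
        amount: { type: apigateway.JsonSchemaType.NUMBER },
      },
    },
  });

  // Attach the model and body validation to the POST method.
  api.root.addResource('orders').addMethod('POST',
    new apigateway.LambdaIntegration(handler), {
      requestValidatorOptions: { validateRequestBody: true },
      requestModels: { 'application/json': orderModel },
    });
}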

Events can be validated, transformed, batched, routed, filtered, and enriched without a function. Failures can be handled and directed to DLQs without try-catch code, and successful completions can be directed on to other functions and service endpoints. Moving these constructs from application code into configuration reduces or removes application code, eliminating the need for security patching and other maintenance.
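Routing and failure handling can likewise live in configuration. The sketch below, again TypeScript with the AWS CDK and purely illustrative names, declares an EventBridge rule that filters events by pattern and delivers matches to a queue with retries and a dead-letter queue, with no switch statement or try-catch in function code.

import { Stack } from 'aws-cdk-lib';
import * as events from 'aws-cdk-lib/aws-events';
import * as targets from 'aws-cdk-lib/aws-events-targets';
import * as sqs from 'aws-cdk-lib/aws-sqs';

export function wireOrderRouting(stack: Stack) {
  const highValueQueue = new sqs.Queue(stack, 'HighValueOrders');
  const deliveryDlq = new sqs.Queue(stack, 'RoutingDlq');

  // Routing and filtering are declared as an event pattern; no if/else or
  // switch inside a Lambda function selects and forwards these events.
  new events.Rule(stack, 'OrderPlacedRule', {
    eventPattern: {
      source: ['acme.orders'],        // hypothetical event source
      detailType: ['OrderPlaced'],
      detail: { currency: ['USD'] },  // exact-match content filter
    },
    targets: [
      new targets.SqsQueue(highValueQueue, {
        deadLetterQueue: deliveryDlq, // failed deliveries go to a DLQ, no try-catch
        retryAttempts: 3,
      }),
    ],
  });
}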

A primitive and a construct in programming have distinct meanings and roles. A primitive is a basic data type inherently part of a programming language. It embodies a basic value, such as an integer, float, boolean, or character, and does not comprise other types. Mirroring this concept, the cloud - just like a giant programming runtime - is evolving from infrastructure primitives like network load balancers, virtual machines, file storage, and databases to more refined and configurable cloud constructs.

Like programming constructs, these cloud constructs orchestrate distributed application interactions and manage complex data flows. However, these constructs are not isolated cloud services; there isn't a standalone "filtering as a service" or "event emitter as a service." There are no "Constructs as a Service," but they are increasingly essential features of core cloud primitives such as gateways, data stores, message brokers, and function runtimes.

This evolution reduces application code complexity and, in many cases, eliminates the need for custom functions. This shift from FaaS to NoFaaS (no fuss, implying simplicity) is just beginning, with insightful talks and code examples on GitHub. Next, I will explore the emergence of construct-rich cloud services within vertical multi-cloud services.

In the post-serverless cloud era, it's no longer enough to offer highly scalable cloud primitives like compute for containers and functions, storage services such as key/value stores, event stores, and relational databases, or networking primitives like load balancers. Post-serverless cloud services must be rich in developer constructs and offload much of the application plumbing. This goes beyond hyperscaling a generic cloud service for a broad user base; it involves deep specialization and exposing advanced constructs to more demanding users.

Hyperscalers like AWS, Azure, GCP, and others, with their vast range of services and extensive user bases, are well-positioned to identify new user needs and constructs. However, providing these more granular developer constructs results in increased complexity. Each new construct in every service requires a deep learning curve with its specifics for effective utilization. Thus, in the post-serverless era, we will observe the rise of vertical multi-cloud services that excel in one area. This shift represents a move toward hyperspecialization of cloud services.

Consider Confluent Cloud as an example. While all major hyperscalers (AWS, Azure, GCP, etc.) offer Kafka services, none match the developer experience and constructs provided by Confluent Cloud. With its Kafka brokers, numerous Kafka connectors, integrated schema registry, Flink processing, data governance, tracing, and message browser, Confluent Cloud delivers the most construct-rich and specialized Kafka service, surpassing what hyperscalers offer.

This trend is not isolated; numerous examples include MongoDB Atlas versus DocumentDB, GitLab versus CodeCommit, Databricks versus EMR, Redis Labs versus ElastiCache, etc. Beyond established cloud companies, a new wave of startups is emerging, focusing on a single multi-cloud primitive (like specialized compute, storage, networking, build-pipeline, monitoring, etc.) and enriching it with developer constructs to offer a unique value proposition. Here are some cloud services hyperspecializing in a single open-source technology, aiming to provide a construct-rich experience and attract users away from hyperscalers:

This list represents a fraction of a growing ecosystem of hyperspecialized vertical multi-cloud services built atop core cloud primitives offered by hyperscalers. They compete by providing a comprehensive set of programmable constructs and an enhanced developer experience.

Serverless cloud services hyperspecializing in one thing with rich developer constructs

Once this transition is completed, bare-bones cloud services without rich constructs, even serverless ones, will seem like outdated on-premise software. A storage service must stream changes like DynamoDB; a message broker should include EventBridge-like constructs for event-driven routing, filtering, and endpoint invocation with retries and DLQs; a pub/sub system should offer message batching, splitting, filtering, transforming, and enriching.

Ultimately, while hyperscalers expand horizontally with an increasing array of services, hyperspecializers grow vertically, offering a single, best-in-class service enriched with constructs, forming an ecosystem of vertical multi-cloud services. The future of cloud service competition will pivot from infrastructure primitives to a duo of core cloud primitives and developer-centric constructs.

Cloud constructs increasingly blur the boundaries between application and infrastructure responsibilities. The next evolution is the "shift left" of cloud automation, integrating application and automation code in terms of tools and responsibilities. Let's examine how this transition is unfolding.

The first generation of cloud infrastructure management was defined by Infrastructure as Code (IaC), a pattern that emerged to simplify the provisioning and management of infrastructure. This approach is built on the trends set by the commoditization of virtualization in cloud computing.

The initial IaC tools introduced new domain-specific languages (DSLs) dedicated to creating, configuring, and managing cloud resources in a repeatable manner. Tools like Chef, Ansible, Puppet, and Terraform led this phase. These tools, leveraging declarative languages, allowed operations teams to define the infrastructure's desired state in code, abstracting underlying complexities.

However, as the cloud landscape transitions from low-level coarse-grained infrastructure to more developer-centric programmable finer-grained constructs, a trend toward using existing general-purpose programming languages for defining these constructs is emerging. New entrants like Pulumi and the AWS Cloud Development Kit (CDK) are at the forefront of this wave, supporting languages such as TypeScript, Python, C#, Go, and Java.

The shift to general-purpose languages is driven by the need to overcome the limitations of declarative languages, which lack expressiveness and flexibility for programmatically defining cloud constructs, and by the shift-left of configuring cloud constructs responsibilities from operations to developers. Unlike the static nature of declarative languages suited for low-level static infrastructure, general-purpose languages enable developers to define dynamic, logic-driven cloud constructs, achieving a closer alignment with application code.
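As a small illustration of that expressiveness, here is a hedged Pulumi sketch in TypeScript, assuming a configured @pulumi/aws provider; the environment list and bucket settings are invented for the example. An ordinary loop and conditional drive resource creation, something a purely declarative DSL cannot express as directly.

import * as aws from "@pulumi/aws";

const environments = ["dev", "staging", "prod"];

for (const env of environments) {
  // Logic-driven configuration: only the prod bucket gets versioning.
  new aws.s3.Bucket(`artifacts-${env}`, {
    versioning: { enabled: env === "prod" },
    tags: { environment: env },
  });
}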

Shifting-left of application composition from infrastructure to developer teams

Post-serverless cloud developers need to implement business logic by creating functions and microservices, but also to compose them together using programmable cloud constructs. This shapes a broader set of developer responsibilities: developing and composing cloud applications. For example, business logic in a Lambda function may also need routing, filtering, and request transformation configurations in API Gateway.

Another Lambda function may need a DynamoDB streaming configuration to stream specific data changes, along with EventBridge routing, filtering, and enrichment configurations.

A third application may have most of its orchestration logic expressed as a Step Functions state machine in which the Lambda code is only a small task. A developer, not a platform engineer or Ops team member, can compose these units of code together. Tools such as Pulumi, the AWS CDK, and others that let a developer implement a function in the language of their choice and use that same language to compose its interaction with the cloud environment are best suited for this era.
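As a sketch of what such composition can look like, the TypeScript CDK snippet below pairs a business-logic function with a DynamoDB stream event source, with batching and filtering declared in the same language as the function itself; the table, code path, and filter criteria are illustrative assumptions, not taken from the article.

import { Stack } from 'aws-cdk-lib';
import * as dynamodb from 'aws-cdk-lib/aws-dynamodb';
import * as lambda from 'aws-cdk-lib/aws-lambda';
import { DynamoEventSource } from 'aws-cdk-lib/aws-lambda-event-sources';

export function composeOrderProjection(stack: Stack) {
  const table = new dynamodb.Table(stack, 'Orders', {
    partitionKey: { name: 'orderId', type: dynamodb.AttributeType.STRING },
    stream: dynamodb.StreamViewType.NEW_IMAGE,
  });

  const projector = new lambda.Function(stack, 'OrderProjector', {
    runtime: lambda.Runtime.NODEJS_18_X,
    handler: 'index.handler',
    code: lambda.Code.fromAsset('lambda/order-projector'), // hypothetical asset path
  });

  // Streaming, batching, and filtering are composed as constructs alongside
  // the business-logic function, by the developer, in one language.
  projector.addEventSource(new DynamoEventSource(table, {
    startingPosition: lambda.StartingPosition.LATEST,
    batchSize: 25,
    filters: [lambda.FilterCriteria.filter({
      eventName: lambda.FilterRule.isEqual('INSERT'),
    })],
  }));
}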

Platform teams can still use declarative languages, such as Terraform, to govern, secure, monitor, and enable teams in the cloud environments, but developer-focused constructs, combined with developer-focused cloud automation languages, will shift cloud constructs left and make developer self-service in the cloud a reality.

The transition from DSL to general-purpose languages marks a significant milestone in the evolution of IaC. It acknowledges the transition of application code into cloud constructs, which often require a deeper developer control of the resources for application needs. This shift represents a maturation of IaC tools, which now need to cater to a broader spectrum of infrastructure orchestration needs, paving the way for more sophisticated, higher-level abstractions and tools.

The journey of infrastructure management will see a shift from static configurations to a more dynamic, code-driven approach. This evolution hasn't stopped at Infrastructure as Code; it is transcending into a more nuanced realm known as Composition as Code. This paradigm further blurs the lines between application code and infrastructure, leading to more streamlined, efficient, and developer-friendly practices.

In summarizing the trends and their reinforcing effects, we're observing an increasing integration of programming constructs into cloud services. Every compute service will integrate CI/CD pipelines; databases will provide HTTP access from the edge and emit change events; message brokers will enhance capabilities with filtering, routing, idempotency, transformations, DLQs, etc.

Infrastructure services are evolving into serverless APIs, with infrastructure inferred from code (IfC), defined by frameworks, or explicitly composed by developers (CaC). This evolution leads to smaller functions and sometimes to a NoFaaS pattern, paving the way for hyperspecialized, developer-first vertical multi-cloud services. These services will offer infrastructure as programmable APIs, enabling developers to seamlessly merge their applications using their preferred programming language.

The shift-left of application composition using cloud services will increasingly blend with application programming, transforming microservices from an architectural style to an organizational one. A microservice will no longer be just a single deployment unit or process boundary but a composition of functions, containers, and cloud constructs, all implemented and glued together in a single language chosen by the developer. The future is shaping to be hyperspecialized and focused on the developer-first cloud.
