Sailing the Greek Islands – Greece Sailing Vacations

Greece yacht vacations are your gateway to exploring the Cyclades. We'll take you to one of the world's outstanding cruising grounds: the Cyclades, in the southern Aegean, which are considered some of the most beautiful islands in the world. Mountains, white cubist villages, and black and golden sand beaches combine to make the islands picture-postcard perfect.

There are about 220 islands in the Cyclades, large and small. That's why the Cyclades are the main destination for sailing vacations in Greece.

Chartering a small yacht in Greece for private sailing around the Greek islands lets you customize your itinerary according to your tastes. Your skipper will be happy to guide you around the finest secluded coves, charming tiny harbors, or busier islands. Visit pilgrimage sites and experience secluded pearls, crystal-clear waters, and warm evenings.

Some of our boats are available only for 7-day private Greek island sailing trips, and others also for shorter sailing holidays (5-day, 4-day, or 1-day private sailing cruises from Paros in the Cyclades, Greece).

Starting from Paros, we suggest the following cruises in the Greek islands on your private yacht.

Islands for Sale in Canada – Private Islands Online

Island buffs acknowledge Canada as the country with more private islands for sale than any other in the world. Unlike many island regions around the globe, foreign ownership is secure and Canada's political climate remains stable. Further, Canada has almost no restrictions on foreigners purchasing property, and the process is relatively simple. While the procedure for buying an island in Canada is fairly straightforward, it is always advisable to retain the services of a qualified lawyer and a knowledgeable real estate agent or broker.

Canada's islands cost a fraction of comparable properties in the U.S. and provide ideal opportunities for owners seeking friendly locals, efficient transportation, and stable infrastructure. Canada's hot summers and temperate spring and fall facilitate every variety of water sport, especially fishing and sailing. Canada's coasts and lakes are dotted with accommodating yacht clubs and marinas. Landlubbers can enjoy Canada's spectacular hiking, abundant wildlife, or numerous golf courses.

Canada has an East and a West coast with literally thousands of lakes in between. On the West coast, the majority of islands are located in the Gulf Islands, between Vancouver Island and the Pacific coast of British Columbia. In central Canada, the archipelagos of Ontario's Georgian Bay on Lake Huron, the Lake of the Woods, and the Thousand Islands regions boast spectacular freshwater island properties. Nova Scotia's peninsular Atlantic coast is a popular market, with islands available around the more remote Cape Breton Island and the more affluent South Shore.

What Is Cryonics? – How Cryonics Works | HowStuffWorks

Cryonics is the practice of preserving human bodies in extremely cold temperatures with the hope of reviving them sometime in the future. The idea is that, if someone has "died" from a disease that is incurable today, he or she can be "frozen" and then revived in the future when a cure has been discovered. A person preserved this way is said to be in cryonic suspension.

To understand the technology behind cryonics, think about the news stories you've heard of people who have fallen into an icy lake and have been submerged for up to an hour in the frigid water before being rescued. The ones who survived did so because the icy water put their body into a sort of suspended animation, slowing down their metabolism and brain function to the point where they needed almost no oxygen.

Cryonics is a bit different from being resuscitated after falling into an icy lake, though. First of all, it's illegal to perform cryonic suspension on someone who is still alive. People who undergo this procedure must first be pronounced legally dead -- that is, their heart must have stopped beating. But if they're dead, how can they ever be revived? According to scientists who perform cryonics, "legally dead" is not the same as "totally dead." Total death, they say, is the point at which all brain function ceases. Legal death occurs when the heart has stopped beating, but some cellular brain function remains. Cryonics preserves the little cell function that remains so that, theoretically, the person can be resuscitated in the future.

Ted Williams – Wikipedia

American baseball player (1918–2002)

[Photo: Williams in 1949]

Theodore Samuel Williams (August 30, 1918 – July 5, 2002) was an American professional baseball player and manager. He played his entire 19-year Major League Baseball (MLB) career, primarily as a left fielder, for the Boston Red Sox from 1939 to 1960; his career was interrupted by military service during World War II and the Korean War. Nicknamed "Teddy Ballgame", "the Kid", "the Splendid Splinter", and "The Thumper", Williams is regarded as one of the greatest hitters in baseball history and to date is the last player to hit over .400 in a season.

Williams was a nineteen-time All-Star,[1] a two-time recipient of the American League (AL) Most Valuable Player Award, a six-time AL batting champion, and a two-time Triple Crown winner. He finished his playing career with a .344 batting average, 521 home runs, and a .482 on-base percentage, the highest of all time. His career batting average is the highest of any MLB player whose career was played primarily in the live-ball era, and ranks tied for 7th all-time (with Billy Hamilton).

Born and raised in San Diego, Williams played baseball throughout his youth. After joining the Red Sox in 1939, he immediately emerged as one of the sport's best hitters. In 1941, Williams posted a .406 batting average; he is the last MLB player to bat over .400 in a season. He followed this up by winning his first Triple Crown in 1942. Williams was required to interrupt his baseball career in 1943 to serve three years in the United States Navy and Marine Corps during World War II. Upon returning to MLB in 1946, Williams won his first AL MVP Award and played in his only World Series. In 1947, he won his second Triple Crown. Williams was returned to active military duty for portions of the 1952 and 1953 seasons to serve as a Marine combat aviator in the Korean War. In 1957 and 1958 at the ages of 39 and 40, respectively, he was the AL batting champion for the fifth and sixth time.

Williams retired from playing in 1960. He was inducted into the Baseball Hall of Fame in 1966, in his first year of eligibility.[2] Williams managed the Washington Senators/Texas Rangers franchise from 1969 to 1972. An avid sport fisherman, he hosted a television program about fishing, and was inducted into the IGFA Fishing Hall of Fame.[3] Williams's involvement in the Jimmy Fund helped raise millions of dollars for cancer care and research. In 1991, President George H. W. Bush presented Williams with the Presidential Medal of Freedom, the highest civilian award bestowed by the United States government. He was selected for the Major League Baseball All-Time Team in 1997 and the Major League Baseball All-Century Team in 1999.

Williams was born in San Diego on August 30, 1918,[4] and named Theodore Samuel Williams after former president Theodore Roosevelt as well as his father, Samuel Stuart Williams.[5] He later amended his birth certificate, removing his middle name,[5] which he claimed originated from a maternal uncle (whose actual name was Daniel Venzor), who had been killed in World War I.[6] His father was a soldier, sheriff, and photographer from Ardsley, New York,[7] while his mother, May Venzor, a Spanish-Mexican-American from El Paso, Texas, was an evangelist and lifelong soldier in the Salvation Army.[5] Williams resented his mother's long hours working in the Salvation Army,[8] and Williams and his brother cringed when she took them to the Army's street-corner revivals.[9]

Williams's paternal ancestors were a mix of Welsh, English, and Irish. The maternal, Spanish-Mexican side of Williams's family was quite diverse, having Spanish (Basque), Russian, and American Indian roots.[10] Of his Mexican ancestry he said that "If I had my mother's name, there is no doubt I would have run into problems in those days, [considering] the prejudices people had in Southern California."[11]

Williams lived in San Diego's North Park neighborhood (4121 Utah Street).[12] At the age of eight, he was taught how to throw a baseball by his uncle, Saul Venzor. Saul was one of his mother's four brothers, as well as a former semi-professional baseball player who had pitched against Babe Ruth, Lou Gehrig, and Joe Gordon in an exhibition game.[13][14] As a child, Williams's heroes were Pepper Martin of the St. Louis Cardinals and Bill Terry of the New York Giants.[15] Williams graduated from Herbert Hoover High School in San Diego, where he played baseball as a pitcher and was the star of the team.[16] During this time, he also played American Legion Baseball, later being named the 1960 American Legion Baseball Graduate of the Year.[17]

Though he had offers from the St. Louis Cardinals and the New York Yankees while he was still in high school,[18] his mother thought he was too young to leave home, so he signed up with the local minor league club, the San Diego Padres.[19]

Throughout his career, Williams stated his goal was to have people point to him and remark, "There goes Ted Williams, the greatest hitter who ever lived."[20]

Williams played back-up behind Vince DiMaggio and Ivey Shiver on the (then) Pacific Coast League's San Diego Padres. While in the Pacific Coast League in 1936, Williams met future teammates and friends Dom DiMaggio and Bobby Doerr, who were on the league's San Francisco Seals.[21] When Shiver announced he was quitting to become a high school football coach in Savannah, Georgia, the job, by default, was open for Williams.[22] Williams posted a .271 batting average on 107 at bats in 42 games for the Padres in 1936.[22] Unknown to Williams, he had caught the eye of the Boston Red Sox's general manager, Eddie Collins, while Collins was scouting Bobby Doerr and the shortstop George Myatt in August 1936.[22][23] Collins later explained, "It wasn't hard to find Ted Williams. He stood out like a brown cow in a field of white cows."[22] In the 1937 season, after graduating from Hoover High in the winter, Williams finally broke into the line-up on June 22, when he hit an inside-the-park home run to help the Padres win 3–2. The Padres went on to win the PCL title, while Williams ended up hitting .291 with 23 home runs.[22] Meanwhile, Collins kept in touch with Padres general manager Bill Lane, calling him twice during the season. In December 1937, during the winter meetings, Lane and Collins struck a deal sending Williams to the Boston Red Sox in exchange for $35,000, two major leaguers (Dom D'Allessandro and Al Niemiec), and two minor leaguers.[24][25]

In 1938, the 19-year-old Williams was 10 days late to spring training camp in Sarasota, Florida, because a flood in California had blocked the railroads. Williams had to borrow $200 from a bank to make the trip from San Diego to Sarasota.[26] Also during spring training, Williams was nicknamed "the Kid" by Red Sox equipment manager Johnny Orlando, who, after Williams arrived in Sarasota for the first time, said, "'The Kid' has arrived". Orlando still called Williams "the Kid" 20 years later,[26] and the nickname stuck with Williams the rest of his life.[27] Williams remained in major league spring training for about a week[26] before being sent to the Double-A Minneapolis Millers.[28] While in the Millers' training camp that spring, Williams met Rogers Hornsby, who had hit over .400 three times, including a .424 average in 1924.[29] Hornsby, who was a coach for the Millers that spring,[29] gave Williams useful advice, including how to "get a good pitch to hit".[28] Talking with the game's greats would become a pattern for Williams, who also spoke with Hugh Duffy, who hit .438 in 1894; Bill Terry, who hit .401 in 1930; and Ty Cobb, with whom he would argue that a batter should hit up on the ball, as opposed to Cobb's view that a batter should hit down on it.[30]

While in Minnesota, Williams quickly became the team's star.[31] He collected his first hit in the Millers' first game of the season, and his first and second home runs during his third game. Both were inside-the-park home runs, with the second traveling an estimated 500 feet (150 m) on the fly to a 512-foot (156 m) center field fence.[31] Williams later had a 22-game hitting streak that lasted from Memorial Day through mid-June.[31] While the Millers finished sixth in an eight-team race,[31] Williams ended up hitting .366 with 46 home runs and 142 RBIs. He won the American Association's Triple Crown and finished second in the voting for Most Valuable Player.[32]

Williams came to spring training three days late in 1939, both because he drove from California to Florida and because of respiratory problems, the latter of which would plague him for the rest of his career.[33] In the winter, the Red Sox had traded right fielder Ben Chapman to the Cleveland Indians to make room for Williams on the roster, even though Chapman had hit .340 the previous season.[34][35] This led Boston Globe sports journalist Gerry Moore to quip, "Not since Joe DiMaggio broke in with the Yankees by "five for five" in St. Petersburg in 1936 has any baseball rookie received the nationwide publicity that has been accorded this spring to Theodore Francis [sic] Williams".[33] Williams inherited Chapman's number 9 on his uniform, as opposed to the number 5 he had worn in the previous spring training. He made his major league debut against the New York Yankees on April 20,[36] going 1-for-4 against Yankee pitcher Red Ruffing. This was the only game that featured Williams and Lou Gehrig playing against one another.[37] In his first series at Fenway Park, Williams hit a double, a home run, and a triple, the first two against Cotton Pippen, who had given Williams his first professional strikeout back when Williams was in San Diego.[38] By July, Williams was hitting just .280, but leading the league in RBIs.[38] Johnny Orlando, now Williams's friend, then gave him a quick pep talk, telling him that he should hit .335 with 35 home runs and drive in 150 runs. Williams said he would buy Orlando a Cadillac if this all came true.[39] Williams ended up hitting .327 with 31 home runs and 145 RBIs,[36] leading the league in the latter category, becoming the first rookie to lead the league in RBIs,[40] and finishing fourth in MVP voting.[41] He also led the AL in walks, with 107, a rookie record. Even though there was no Rookie of the Year award yet in 1939, Babe Ruth declared Williams the Rookie of the Year, which Williams later said was "good enough for me".[42]

Williams's pay doubled in 1940, going from $5,000 to $10,000.[43] A new bullpen was added in right field of Fenway Park, reducing the distance from home plate from 400 feet to 380 feet and earning the nickname "Williamsburg" for being "obviously designed for Williams".[44] Williams was then switched from right field to left field, as there would be less sun in his eyes, and it would give Dom DiMaggio a chance to play center. Finally, Williams was flip-flopped in the order with the great slugger Jimmie Foxx, with the idea that Williams would get more pitches to hit.[44] Pitchers, though, proved willing to pitch around the eagle-eyed Williams in favor of facing the 32-year-old Foxx, the reigning AL home run champion, followed by the still highly productive 33-year-old Joe Cronin, the player-manager.[45] Williams also made his first of 16 All-Star Game appearances[46] in 1940, going 0-for-2.[47] Although Williams hit .344, his power and runs batted in were down from the previous season, with 23 home runs and 113 RBIs.[36] Williams also caused a controversy in mid-August when he called his salary "peanuts", along with saying he hated the city of Boston and reporters, leading reporters to lash back at him, saying that he should be traded.[48] Williams said that the "only real fun" he had in 1940 was being able to pitch once on August 24, when he pitched the last two innings in a 12–1 loss to the Detroit Tigers, allowing one earned run on three hits, while striking out one batter, Rudy York.[49][50]

In the second week of spring training in 1941, Williams broke a bone in his right ankle, limiting him to pinch hitting for the first two weeks of the season.[51] Bobby Doerr later claimed that the injury would be the foundation of Williams's season, as it forced him to put less pressure on his right foot for the rest of the season.[52] Against the Chicago White Sox on May 7, in extra innings, Williams told the Red Sox pitcher, Charlie Wagner, to hold the White Sox, since he was going to hit a home run. In the 11th inning, Williams's prediction came true, as he hit a big blast to help the Red Sox win. The home run is still considered to be the longest home run ever hit in the old Comiskey Park, some saying that it went 600 feet (180 m).[53] Williams's average slowly climbed in the first half of May, and on May 15, he started a 22-game hitting streak. From May 17 to June 1, Williams batted .536, with his season average going above .400 on May 25 and then continuing up to .430.[54] By the All-Star break, Williams was hitting .406 with 62 RBIs and 16 home runs.[55]

In the 1941 All-Star Game, Williams batted fourth behind Joe DiMaggio, who was in the midst of his record-breaking hitting streak, having hit safely in 48 consecutive games.[56] In the fourth inning Williams doubled to drive in a run.[57] With the National League (NL) leading 5–2 in the eighth inning, Williams struck out in the middle of an American League (AL) rally.[56] In the ninth inning the AL still trailed 5–3; Ken Keltner and Joe Gordon singled, and Cecil Travis walked to load the bases.[57] DiMaggio grounded to the infield and Billy Herman, attempting to complete a double play, threw wide of first base, allowing Keltner to score.[57] With the score 5–4 and runners on first and third, Williams homered with his eyes closed to secure a 7–5 AL win.[57][58] Williams later said that that game-winning home run "remains to this day the most thrilling hit of my life".[59]

In late August, Williams was hitting .402.[59] Williams said that "just about everybody was rooting for me" to hit .400 for the season, including Yankee fans, who gave pitcher Lefty Gomez a "hell of a boo" for walking Williams with the bases loaded after Williams had gotten three straight hits in one September game.[60] In mid-September, Williams was hitting .413, but dropped a point a game from then on.[59] Before the final two games on September 28, a doubleheader against the Philadelphia Athletics, he was batting .39955, which would have been officially rounded up to .400.[59] Red Sox manager Joe Cronin offered him the chance to sit out the final day, but he declined. "If I'm going to be a .400 hitter", he said at the time, "I want more than my toenails on the line."[61] Williams went 6-for-8 on the day, finishing the season at .406.[62] (Sacrifice flies were counted as at-bats in 1941; under today's rules, Williams would have hit between .411 and .419, based on contemporaneous game accounts.[61]) Philadelphia fans ran out on the field to surround Williams after the game, forcing him to protect his hat from being stolen; he was helped into the clubhouse by his teammates.[63] Along with his .406 average, Williams also hit 37 home runs and batted in 120 runs, missing the Triple Crown by five RBIs.[36][61]
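
To make the rounding arithmetic concrete, here is a minimal Python sketch. The hit and at-bat counts used (179-for-448 entering the doubleheader, then 6-for-8 on the day) are the commonly cited totals consistent with the figures above, but they are assumptions here rather than numbers stated in this article.

```python
# Batting average is hits divided by at-bats, reported to three decimals.
def batting_average(hits: int, at_bats: int) -> str:
    return f"{hits / at_bats:.3f}"

# Entering the final doubleheader: 179-for-448 = .39955..., which official
# rounding would have reported as .400 (assumed counts, see note above).
print(batting_average(179, 448))          # 0.400

# After going 6-for-8 in the doubleheader: 185-for-456 = .406.
print(batting_average(179 + 6, 448 + 8))  # 0.406
```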

Williams's 1941 season is often considered the best offensive season of all time, though the MVP award went to DiMaggio. The .406 batting average, his first of six batting championships, is still the highest single-season average in Red Sox history and the highest batting average in the major leagues since 1924, and it marks the last time any major league player has hit over .400 for a season while averaging at least 3.1 plate appearances per game. ("If I had known hitting .400 was going to be such a big deal", he quipped in 1991, "I would have done it again."[61]) Williams's on-base percentage of .553 and slugging percentage of .735 that season are both also the highest single-season marks in Red Sox history. The .553 OBP stood as a major league record until it was broken by Barry Bonds in 2002, and his .735 slugging percentage was the highest mark in the major leagues between 1932 and 1994. His OPS of 1.287 that year, a Red Sox record, was the highest in the major leagues between 1923 and 2001. Despite playing in only 143 games that year, Williams led the league with 135 runs scored and 37 home runs, and he finished third with 335 total bases, the most home runs, runs scored, and total bases by a Red Sox player since Jimmie Foxx in 1938.[64] Williams placed second in MVP voting; DiMaggio won, 291 votes to 254,[65] on the strength of his record-breaking 56-game hitting streak and league-leading 125 RBIs.[62]
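
For reference, OPS is simply on-base percentage plus slugging percentage. A quick check with the rounded figures quoted above gives 1.288 rather than 1.287; presumably the official value sums the unrounded percentages before rounding. A one-line sketch:

```python
# OPS = OBP + SLG, using the rounded 1941 figures quoted above.
print(round(0.553 + 0.735, 3))  # 1.288 (the official 1.287 presumably
                                # rounds after summing unrounded values)
```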

In January 1942, just over two years after World War II began,[66][67] Williams was drafted into the military and placed in Class 1-A. A friend suggested that Williams see the advisor of the governor's Selective Service appeal agent, since Williams was the sole support of his mother; the advisor argued that Williams should not have been placed in Class 1-A and should be reclassified 3-A.[66] Williams was reclassified to 3-A ten days later.[68] Afterwards, the public reaction was extremely negative,[69] even though the baseball book Season of '42 states that only four All-Stars and one first-line pitcher entered military service during the 1942 season. (Many more MLB players would enter service during the 1943 season.)[70]

Quaker Oats stopped sponsoring Williams, and Williams, who previously had eaten Quaker products "all the time", never "[ate] one since".[68]

Despite the trouble with the draft board, Williams had a new salary of $30,000 in 1942.[68] That season, Williams won the Triple Crown,[62] with a .356 batting average, 36 home runs, and 137 RBIs.[36] On May 21, Williams also hit his 100th career home run, becoming the third Red Sox player to hit 100 home runs with the team, after his teammates Jimmie Foxx and Joe Cronin. Despite winning the Triple Crown, Williams came in second in the MVP voting, losing to Joe Gordon of the Yankees. Williams felt that he should have gotten a "little more consideration" for winning the Triple Crown, and he thought that "the reason I didn't get more consideration was because of the trouble I had with the draft [boards]".[62]

Williams joined the Navy Reserve on May 22, 1942, went on active duty in 1943, and was commissioned a second lieutenant in the United States Marine Corps as a Naval Aviator on May 2, 1944. After eight weeks in Amherst, Massachusetts, in the Civilian Pilot Training Course, Williams went through pre-flight training in Chapel Hill, North Carolina, where he played on the base's baseball team along with his Red Sox teammate Johnny Pesky.[72] While on that team, Williams was sent back to Fenway Park on July 12, 1943, to play on an All-Star team managed by Babe Ruth. The newspapers reported that Ruth said when finally meeting Williams, "Hiya, kid. You remind me a lot of myself. I love to hit. You're one of the most natural ballplayers I've ever seen. And if my record is broken, I hope you're the one to do it".[73] Williams later said he was "flabbergasted" by the incident, as "after all, it was Babe Ruth".[73] In the game, Williams hit a 425-foot home run to help give the American League All-Stars a 9–8 win.[74]

On September 2, 1945, when the war ended, Lt. Williams was in Pearl Harbor, Hawaii, awaiting orders as a replacement pilot. While in Pearl Harbor, Williams played baseball in the Navy League; also in that eight-team league were Joe DiMaggio, Joe Gordon, and Stan Musial. The Service World Series, pitting the Army against the Navy, attracted crowds of 40,000 for each game. The players said it was even better than the actual World Series, played between the Detroit Tigers and the Chicago Cubs that year.[75]

Williams was discharged by the Marine Corps on January 28, 1946, in time to begin preparations for the upcoming pro baseball season.[76][77] He rejoined the Red Sox in 1946, signing a $37,500 contract.[78] On July 14, after Williams hit three home runs and drove in eight runs in the first game of a doubleheader, Lou Boudreau, inspired by Williams's consistent pull hitting to right field, created what would later be known as the Boudreau shift (also the Williams shift), leaving only one player (the left fielder) on the left side of second base. Ignoring the shift, Williams walked twice, doubled, and grounded out to the shortstop, who was positioned between first and second base.[79][80] Also during 1946, the All-Star Game was held in Fenway Park. In the game, Williams homered in the fourth inning against Kirby Higbe, singled in a run in the fifth inning, singled in the seventh inning, and hit a three-run home run against Rip Sewell's "eephus pitch" in the eighth inning[81] to help the American League win 12–0.[82]

For the 1946 season, Williams hit .342 with 38 home runs and 123 RBIs,[36] helping the Red Sox win the pennant on September 13. During the season, Williams hit the only inside-the-park home run of his major league career in a September 10 win at Cleveland,[83][84] and in June hit what is considered the longest home run in Fenway Park history, at 502 feet (153 m), subsequently marked with a lone red seat in the Fenway bleachers.[85] Williams won the MVP voting in a runaway.[86] During an exhibition game in Fenway Park against an All-Star team in early October, Williams was hit on the elbow by a curveball from the Washington Senators' pitcher Mickey Haefner. Williams was immediately taken out of the game, and X-rays of his arm showed no damage, but his arm was "swelled up like a boiled egg", according to Williams.[87] Williams could not swing a bat again until four days later, one day before the World Series, when he reported the arm as "sore".[87] During the Series, Williams batted .200, going 5-for-25 with no home runs and just one RBI. The Red Sox lost in seven games,[88] with Williams going 0-for-4 in the last game.[89] Fifty years later, when asked what one thing he would have done differently in his life, Williams replied, "I'd have done better in the '46 World Series. God, I would".[87] The 1946 World Series was the only World Series Williams ever appeared in.[90]

Williams signed a $70,000 contract in 1947.[91] Williams was also almost traded for Joe DiMaggio in 1947. In late April, Red Sox owner Tom Yawkey and Yankees owner Dan Topping agreed to swap the players, but a day later canceled the deal when Yawkey requested that Yogi Berra come with DiMaggio.[92] In May, Williams was hitting .337.[93] Williams won the Triple Crown in 1947, but lost the MVP award to Joe DiMaggio, 202 points to 201 points. One writer left Williams off his ballot. Williams thought it was Mel Webb, whom Williams called a "grouchy old guy",[94] although it now appears it was not Webb.[95]

Williams was the third major league player with at least four seasons of 30 home runs and 100 RBIs in the first five years of his career, joining Chuck Klein and Joe DiMaggio; through 2011, Ralph Kiner, Mark Teixeira, Albert Pujols, and Ryan Braun had followed.[96]

In 1948, under new Red Sox manager Joe McCarthy, the former New York Yankees skipper,[97] Williams hit a league-leading .369 with 25 home runs and 127 RBIs,[36] and was third in MVP voting.[98] On April 29, Williams hit his 200th career home run, becoming just the second player to hit 200 home runs in a Red Sox uniform, after his former teammate Jimmie Foxx.[64] On October 2, against the Yankees, Williams hit his 222nd career home run, tying Foxx for the Red Sox all-time record.[99] In their final two games of the regular schedule, the Red Sox beat the Yankees (to force a one-game playoff against the Cleveland Indians), and Williams got on base eight times in ten plate appearances.[97] In the playoff, Williams went 1-for-4,[100] and the Red Sox lost 8–3.

In 1949, Williams received a new salary of $100,000 ($1,139,000 in current dollar terms).[101] He hit .343 (losing the AL batting title by just .0002 to the Tigers' George Kell, and with it the Triple Crown), hit 43 home runs, his career high, and drove in 159 runs, tied for the league lead; at one point he got on base in 84 straight games, an MLB record that still stands, all of which helped him win the MVP trophy.[36][102] On April 28, Williams hit his 223rd career home run, passing Jimmie Foxx to break the record for most home runs in a Red Sox uniform.[103] Williams is still the Red Sox career home run leader.[64] However, despite being ahead of the Yankees by one game just before a two-game series against them (the last regular-season games for both teams),[97] the Red Sox lost both of those games.[104] The Yankees won the first of what would be five straight World Series titles in 1949.[105] For the rest of Williams's career, the Yankees won nine pennants and six World Series titles, while the Red Sox never finished better than third place.[105]

In 1950, Williams was playing in his eighth All-Star Game. In the first inning, Williams caught a line drive by Ralph Kiner, slamming into the Comiskey Park scoreboard and breaking his left arm.[46] Williams played on, even singling in a run to give the American League the lead in the fifth inning, but by then his arm was a "balloon" and he was in great pain, so he left the game.[106] Both of the doctors who X-rayed Williams held little hope for a full recovery. The doctors operated on Williams for two hours.[107] When Williams took his cast off, he could extend the arm only to within four inches of his right arm.[108] Williams played only 89 games in 1950.[36] After the season, Williams's elbow hurt so much he considered retirement, since he thought he would never be able to hit again. Tom Yawkey, the Red Sox owner, then sent Jack Fadden to Williams's Florida home to talk with him. Williams later thanked Fadden for saving his career.[109]

In 1951, Williams "struggled" to hit .318, with his elbow still hurting.[110] Williams also played in 148 games, 60 more than Williams had played the previous season, 30 home runs, two more than he had hit in 1950, and 126 RBIs, twenty-nine more than 1950.[36][110] Despite his lower-than-usual production at bat, Williams made the All-Star team.[47] On May 15, 1951, Williams became the 11th player in major league history to hit 300 career home runs. On May 21, Williams passed Chuck Klein for 10th place, on May 25 Williams passed Hornsby for ninth place, and on July 5 Williams passed Al Simmons for eighth place all-time in career home runs.[111] After the season, manager Steve O'Neill was fired, with Lou Boudreau replacing him. Boudreau's first announcement as manager was that all Red Sox players were "expendable", including Williams.[110]

Williams's name was called from a list of inactive reserves to serve on active duty in the Korean War on January 9, 1952. Williams, who was livid at being recalled, had a physical scheduled for April 2.[112] Williams passed his physical, and in May, after playing in only six major league games, began refresher flight training and qualification prior to service in Korea. Right before he left for Korea, the Red Sox held a "Ted Williams Day" at Fenway Park. Friends of Williams gave him a Cadillac, and the Red Sox gave Williams a memory book signed by 400,000 fans. The governor of Massachusetts and the mayor of Boston were there, along with Private Frederick Wolf, an injured Korean War veteran from Brooklyn who used a wheelchair.[113] Wolf presented Williams with gifts from wounded veterans; Williams choked up and was only able to say, "... ok kid ...".[115] At the end of the ceremony, everyone in the park held hands and sang "Auld Lang Syne" to Williams, a moment which he later said "moved me quite a bit."[114] The Red Sox went on to win the game 5–3, thanks to a two-run home run by Williams in the seventh inning.[114]

In August 1953, Williams practiced with the Red Sox for ten days before playing in his first game back, garnering a large ovation from the crowd and hitting a home run in the eighth inning.[116] That season, Williams hit .407 with 13 home runs and 34 RBIs in 37 games and 110 at bats (not nearly enough plate appearances to qualify for that season's batting title).[36] On September 6, Williams hit his 332nd career home run, passing Hank Greenberg for seventh all-time.[117]

On the first day of spring training in 1954, Williams broke his collarbone running after a line drive.[116] Williams was out for six weeks, and in April he wrote an article with Joe Reichler of the Saturday Evening Post saying that he intended to retire at the end of the season.[118] Williams returned to the Red Sox lineup on May 7, and he hit .345 with 386 at bats in 117 games, although Bobby Ávila, who had hit .341, won the batting championship. This was because the rules then required a batter to have 400 at bats to qualify, despite Lou Boudreau's attempt to bat Williams second in the lineup to get him more at bats. Williams led the league in bases on balls with 136, which kept him from qualifying under the rules of the time; by today's standard (plate appearances), he would have been the champion. The rule was changed shortly thereafter to keep this from happening again.[36][119] On August 25, Williams passed Johnny Mize for sixth place, and on September 3, Williams passed Joe DiMaggio for fifth on the all-time career home run list with his 362nd career home run. He finished the season with 366 career home runs.[120] On September 26, Williams "retired" after the Red Sox's final game of the season.[121]
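
As a rough sketch of why Williams would qualify under the modern plate-appearance standard (3.1 plate appearances per scheduled game), here is a minimal Python check. Treating his 386 at bats plus 136 walks as a lower bound on plate appearances is a simplification made here, not a figure from the article.

```python
import math

# Modern batting-title rule: at least 3.1 plate appearances per
# scheduled game (478 PA in the 154-game seasons of the era).
def qualifies(plate_appearances: int, scheduled_games: int = 154) -> bool:
    return plate_appearances >= math.ceil(3.1 * scheduled_games)

print(math.ceil(3.1 * 154))   # 478 PA required
print(qualifies(386 + 136))   # True: at least 522 PA from AB + BB alone
```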

During the 1954 off-season, Williams was offered the chance to manage the Red Sox. Williams declined and suggested that Pinky Higgins, who had played third base on the 1946 Red Sox team, become manager; Higgins was hired as Red Sox manager in 1955.[122] Williams sat out the first month of the 1955 season due to a divorce settlement with his wife, Doris. When Williams returned, he signed a $98,000 contract on May 13. Williams batted .356 in 320 at bats on the season, again lacking enough at bats to win the batting title, which went to Al Kaline, who batted .340.[123] Williams hit 28 home runs and drove in 83 runs[36] while being named the "Comeback Player of the Year".[124]

On July 17, 1956, Williams became the fifth player to hit 400 home runs, following Mel Ott in 1941, Jimmie Foxx in 1938, Lou Gehrig in 1936, and Babe Ruth in 1927.[125][126] Three weeks later, at home against the Yankees on August 7, after Williams was booed for dropping a fly ball from Mickey Mantle, he spat at one of the fans who was taunting him on top of the dugout;[127] Williams was fined $5,000 for the incident.[128][129] The following night against Baltimore, Williams was greeted by a large ovation, and received an even larger one when he hit a home run in the sixth inning to break a 2–2 tie. The Boston Globe ran a "What Globe Readers Say About Ted" section made up of letters about Williams, aimed either at the sportswriters or at the "loud mouths" in the stands. Williams explained years later, "From '56 on, I realized that people were for me. The writers had written that the fans should show me they didn't want me, and I got the biggest ovation yet".[130] Williams lost the batting title to Mickey Mantle in 1956, batting .345 to Mantle's .353, with Mantle on his way to winning the Triple Crown.[131]

In 1957, Williams batted .388 to lead the majors, then signed a contract in February 1958 for a record-high $125,000 (some sources say $135,000).[132][133] At age 40 in 1958, he again led the American League, with a .328 batting average.[134]

When Pumpsie Green became the first black player on the Red Sox, the last major league team to integrate, in 1959, Williams openly welcomed Green.[135]

Williams ended his career with a home run in his last at-bat on September 28, 1960. He refused to salute the fans as he returned to the dugout after crossing home plate, or after he was replaced in left field by Carroll Hardy. An essay written by John Updike the following month for The New Yorker, "Hub Fans Bid Kid Adieu", chronicles this event.[136]

Williams is one of only 29 players in baseball history to date to have appeared in Major League games in four decades.[137]

Williams was an obsessive student of hitting. He famously used a lighter bat than most sluggers, because it generated a faster swing.[138] In 1970, he wrote a book on the subject, The Science of Hitting (revised 1986), which is still read by many baseball players.[138] The book describes his theory of swinging only at pitches that came into ideal areas of his strike zone, a strategy Williams credited with his success as a hitter. Pitchers apparently feared Williams; his bases-on-balls-to-plate-appearances ratio (.2065) is still the highest of any player in the Hall of Fame.
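
To make that ratio concrete, here is a minimal Python sketch. The career totals used (2,021 walks in 9,788 plate appearances) are commonly cited figures and should be treated as assumptions here, not numbers stated in this article.

```python
# Walks per plate appearance: bases on balls divided by plate appearances.
def walk_rate(walks: int, plate_appearances: int) -> float:
    return round(walks / plate_appearances, 4)

# Assumed career totals (see note above): 2,021 BB in 9,788 PA.
print(walk_rate(2021, 9788))  # 0.2065
```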

Williams nearly always took the first pitch.[139]

He helped pass on his expertise in playing left field in front of the Green Monster to his successor with the Red Sox, Carl Yastrzemski.[140]

Williams was on uncomfortable terms with the Boston newspapers for nearly twenty years, as he felt they liked to discuss his personal life as much as his baseball performance. He maintained a career-long feud with Sport magazine over a 1948 feature article in which the reporter included a quote from Williams's mother. Insecure about his upbringing, and stubborn because of immense confidence in his own talent, Williams made up his mind that the "knights of the keyboard", as he derisively labeled the press, were against him. After winning the league's Triple Crown in 1947, Williams narrowly lost the MVP award in a vote where one Midwestern newspaper writer left Williams entirely off his ten-player ballot.

During his career, some sportswriters also criticized aspects of Williams's baseball performance, including what they viewed as his lackadaisical fielding and lack of clutch hitting. Williams pushed back, saying: "They're always saying that I don't hit in the clutches. Well, there are a lot [of games] when I do."[141] He also asserted that it made no sense crashing into an outfield wall to try to make a difficult catch because of the risk of injury or being out of position to make the play after missing the ball.[142]

Williams treated most of the press accordingly, as he described in his 1969 memoir My Turn at Bat. Williams also had an uneasy relationship with the Boston fans, though he could be very cordial one-to-one. He felt at times a good deal of gratitude for their passion and their knowledge of the game. On the other hand, Williams was temperamental, high-strung, and at times tactless. In his biography, Ronald Reis relates how Williams committed two fielding miscues in a doubleheader in 1950 and was roundly booed by Boston fans. He bowed three times to various sections of Fenway Park and made an obscene gesture. When he came to bat he spat in the direction of fans near the dugout. The incident caused an avalanche of negative media reaction, and inspired sportswriter Austen Lake's famous comment that when Williams's name was announced the sound was like "autumn wind moaning through an apple orchard."

Another incident occurred in 1958 in a game against the Washington Senators. Williams struck out, and as he stepped from the batter's box swung his bat violently in anger. The bat slipped from his hands, was launched into the stands and struck a 60-year-old woman who turned out to be the housekeeper of the Red Sox general manager Joe Cronin. While the incident was an accident and Williams apologized to the woman personally, to all appearances it seemed at the time that Williams had hurled the bat in a fit of temper.

Williams gave generously to those in need. He was especially linked with the Jimmy Fund of the Dana-Farber Cancer Institute, which provides support for children's cancer research and treatment. Williams used his celebrity to virtually launch the fund, which raised more than $750 million between 1948 and 2010. Throughout his career, Williams made countless bedside visits to children being treated for cancer, which Williams insisted go unreported. Often parents of sick children would learn at check-out time that "Mr. Williams has taken care of your bill".[143] The Fund has stated that "Williams would travel everywhere and anywhere, no strings or paychecks attached, to support the cause... His name is synonymous with our battle against all forms of cancer."[143]

Williams demanded loyalty from those around him. He could not forgive the fickle nature of the fans: booing a player for booting a ground ball, then turning around and roaring approval of the same player for hitting a home run. Despite the cheers and adulation of most of his fans, the occasional boos directed at him in Fenway Park led Williams to stop tipping his cap in acknowledgment after a home run.

Williams maintained this policy up to and including his swan song in 1960. After hitting a home run at Fenway Park in what would be his last career at-bat, Williams characteristically refused either to tip his cap as he circled the bases or to respond to prolonged cheers of "We want Ted!" from the crowd by making an appearance from the dugout. The Boston manager Pinky Higgins sent Williams out to his position in left field to start the ninth inning, but then immediately recalled him for his back-up Carroll Hardy, thus allowing Williams to receive one last ovation as he jogged onto and then off the field, which he did without reacting to the crowd. Williams's aloof attitude led the writer John Updike to observe wryly that "Gods do not answer letters."[136]

Williams's final home run did not come in the final game of the 1960 season, but rather in the Red Sox's last home game that year. The Red Sox played three more games on the road in New York City, but Williams did not appear in any of them, so his final home at-bat proved to be the last of his career.

In 1991, on Ted Williams Day at Fenway Park, Williams pulled a Red Sox cap from out of his jacket and tipped it to the crowd. This was the first time that he had done so since his earliest days as a player.

A Red Smith profile from 1956 describes one Boston writer trying to convince Ted Williams that first cheering and then booing a ballplayer was no different from a moviegoer applauding a "western" movie actor one day and saying the next "He stinks! Whatever gave me the idea he could act?" Williams rejected this; when he liked a western actor like Hoot Gibson, he liked him in every picture, and would not think of booing him.

Williams once had a friendship with Ty Cobb, with whom he often had discussions about baseball. Williams often touted Rogers Hornsby as the greatest right-handed hitter of all time, an assertion said to have caused a split between the two men: during one of their yearly debates about the greatest hitters of all time, Cobb, who apparently had strong feelings about Hornsby, threw a fit and expelled Williams from his hotel room, effectively ending the friendship.[144] This story was later refuted by Williams himself.[145]

Williams served as a Naval Aviator during World War II and the Korean War. Unlike many other major league players, he did not spend all of his war-time playing on service teams.[146] Williams had been classified 3-A by Selective Service prior to the war, a dependency deferment because he was his mother's sole means of financial support. When his classification was changed to 1-A following the American entry into World War II, Williams appealed to his local draft board. The draft board ruled that his draft status should not have been changed. He made a public statement that once he had built up his mother's trust fund, he intended to enlist. Even so, criticism in the media, including withdrawal of an endorsement contract by Quaker Oats, resulted in his enlistment in the U.S. Naval Reserve on May 22, 1942.

Williams did not opt for an easy assignment playing baseball for the Navy, but rather joined the V-5 program to become a Naval aviator. Williams was first sent to the Navy's Preliminary Ground School at Amherst College for six months of academic instruction in various subjects including math and navigation, where he achieved a 3.85 grade point average.

Williams was talented as a pilot, and enjoyed it so much that he had to be ordered by the Navy to leave training to personally accept his 1942 American League Triple Crown.[146] Williams's Red Sox teammate Johnny Pesky, who went into the same aviation training program, said this about Williams: "He mastered intricate problems in fifteen minutes which took the average cadet an hour, and half of the other cadets there were college grads." Pesky also described Williams's acumen in the advanced training, for which Pesky himself did not qualify: "I heard Ted literally tore the sleeve target to shreds with his angle dives. He'd shoot from wingovers, zooms, and barrel rolls, and after a few passes the sleeve was ribbons. At any rate, I know he broke the all-time record for hits." Williams went to Jacksonville for a course in aerial gunnery, the combat pilot's payoff test, and broke all the records in reflexes, coordination, and visual-reaction time. "From what I heard, Ted could make a plane and its six 'pianos' (machine guns) play like a symphony orchestra", Pesky said. "From what they said, his reflexes, coordination, and visual reaction made him a built-in part of the machine."[147]

Williams completed pre-flight training in Athens, Georgia, his primary training at NAS Bunker Hill, Indiana, and his advanced flight training at NAS Pensacola. He received his gold Naval Aviator wings and his commission as a second lieutenant in the U.S. Marine Corps on May 2, 1944.

Williams served as a flight instructor at NAS Pensacola teaching young pilots to fly the complicated F4U Corsair fighter plane. Williams was in Pearl Harbor awaiting orders to join the Fleet in the Western Pacific when the War in the Pacific ended. He finished the war in Hawaii, and then he was released from active duty on January 12, 1946, but he did remain in the Marine Corps Reserve.[77]

On May 1, 1952, 14 months after his promotion to captain in the Marine Corps Reserve, Williams was recalled to active duty for service in the Korean War.[148] He had not flown any aircraft for eight years but he turned down all offers to sit out the war in comfort as a member of a service baseball team. Nevertheless, Williams was resentful of being called up, which he admitted years later, particularly regarding the Navy's policy of calling up Inactive Reservists rather than members of the Active Reserve.

Williams reported for duty on May 2, 1952. After eight weeks of refresher flight training and qualification in the F9F Panther jet fighter with VMF-223 at the Marine Corps Air Station Cherry Point, North Carolina, Williams was assigned to VMF-311, Marine Aircraft Group 33 (MAG-33), based at the K-3 airfield in Pohang, South Korea.[77]

On February 16, 1953, Williams, flying as the wingman for John Glenn (later an astronaut, then a U.S. Senator), was part of a 35-plane raid against a tank and infantry training school just south of Pyongyang, North Korea. As the aircraft from VMF-115 and VMF-311 dove on the target, Williams's plane was hit by anti-aircraft fire; a piece of flak knocked out his hydraulic and electrical systems, forcing Williams to "limp" his plane back to the K-3 air base, where he made a belly landing. For his actions that day, he was awarded the Air Medal.[149]

Williams flew 39 combat missions in Korea, earning the Air Medal with two Gold Stars representing second and third awards, before being withdrawn from flight status in June 1953 after a hospitalization for pneumonia, which led to the discovery of an inner ear infection that disqualified him from flying.[150] John Glenn described Williams as one of the best pilots he knew,[146] while Glenn's wife, Annie, described him as the most profane man she ever met.[151] In the last half of his missions, Williams flew as Glenn's wingman.[152]

Williams likely would have exceeded 600 career home runs if he had not served in the military, and might even have approached Babe Ruth's then record of 714. He might have set the record for career RBIs as well, exceeding Hank Aaron's total.[146] While the absences in the Marine Corps took almost five years out of his baseball career, he never publicly complained about the time devoted to service in the Marine Corps. His biographer, Leigh Montville, argued that Williams was not happy about being pressed into service in South Korea, but he did what he thought was his patriotic duty.

Following his return to the United States in August 1953, he resigned his Reserve commission to resume his baseball career.[148]

After retiring as a player, Williams helped Boston's new left fielder, Carl Yastrzemski, with his hitting, and was a regular visitor to the Red Sox' spring training camps from 1961 to 1966, where he worked as a special batting instructor. He served as executive assistant to Tom Yawkey (1961–65), then was named a team vice president (1965–68) upon his election to the Hall of Fame. He resumed his spring training instruction role with the club in 1978.

Beginning in 1961, he would spend summers at the Ted Williams Baseball Camp in Lakeville, Massachusetts, which he had established in 1958 with his friend Al Cassidy and two other business partners. For eight summers and parts of others after that, he would give hitting clinics and talk baseball at the camp.[5] It was not uncommon to find Williams fishing in the pond at the camp. The area now is owned by the town and a few of the buildings still stand. In the main lodge one can still see memorabilia from Williams's playing days.

Williams served as manager of the Washington Senators from 1969 to 1971, then continued with the team when they became the Texas Rangers after the 1971 season. Williams's best season as a manager was 1969, when he led the expansion Senators to an 86–76 record in the team's only winning season in Washington; he was chosen "Manager of the Year" after that season. Like many great players, Williams became impatient with ordinary athletes' abilities and attitudes, particularly those of pitchers, whom he admitted he never respected. Fellow manager Alvin Dark thought Williams "was a smart, fearless manager" who helped his hitters perform better. Williams's issue with Washington/Texas, according to Dark, arose when the ownership traded away his third baseman and shortstop, making it difficult for the club to be as competitive.[153]

On the subject of pitchers, in his autobiography written with John Underwood, Williams opines on Bob Lemon (a sinker-ball specialist) pitching for the Cleveland Indians around 1951: "I have to rate Lemon as one of the very best pitchers I ever faced. His ball was always moving, hard, sinking, fast-breaking. You could never really uhmmmph with Lemon."

Williams was much more successful in fishing. An avid and expert fly fisherman and deep-sea fisherman, he spent many summers after baseball fishing the Miramichi River, in Miramichi, New Brunswick. Williams was named to the International Game Fish Association Hall of Fame in 2000. Williams, Jim Brown, Cumberland Posey, and Cal Hubbard are the only athletes to be inducted into the Halls of Fame of more than one professional sport. Williams was also known as an accomplished hunter; he was fond of pigeon-shooting for sport in Fenway Park during his career, on one occasion drawing the ire of the Massachusetts Society for the Prevention of Cruelty to Animals.[154]

Williams reached an extensive deal with Sears, lending his name and talent to marketing, developing, and endorsing a line of in-house sports equipment, such as the "Ted Williams" edition Gamefisher aluminum boat and 7.5 hp "Ted Williams" edition motor, as well as fishing, hunting, and baseball equipment. Williams continued his involvement in the Jimmy Fund, later losing a brother to leukemia, and spent much of his spare time, effort, and money in support of the cancer organization.

In his later years Williams became a fixture at autograph shows and card shows after his son (by his third wife), John Henry Williams, took control of his career, becoming his de facto manager. The younger Williams provided structure to his father's business affairs, exposed forgeries that were flooding the memorabilia market, and rationed his father's public appearances and memorabilia signings to maximize their earnings.

One of Williams's final, and most memorable, public appearances was at the 1999 All-Star Game in Boston. Able to walk only a short distance, Williams was brought to the pitcher's mound in a golf cart. He proudly waved his cap to the crowd, a gesture he had never made as a player, and fans responded with a standing ovation that lasted several minutes. At the mound he was surrounded by players from both teams, including fellow Red Sox player Nomar Garciaparra, and was assisted by Tony Gwynn in throwing out the first pitch of that year's All-Star Game. Later in the year, he was among the members of the Major League Baseball All-Century Team introduced to the crowd at Turner Field in Atlanta prior to Game Two of the World Series.

On May 4, 1944, Williams married Doris Soule, the daughter of his hunting guide. Their daughter, Barbara Joyce ("Bobbi Jo"), was born on January 28, 1948, while Williams was fishing in Florida.[155] They divorced in 1954. Williams married the socialite model Lee Howard on September 10, 1961, and they were divorced in 1967.

Williams married Dolores Wettach, a former Miss Vermont and Vogue model, in 1968. Their son John-Henry was born on August 27, 1968, followed by daughter Claudia, on October 8, 1971. They were divorced in 1972.[156]

Williams lived with Louise Kaufman for twenty years until her death in 1993. In his biography of Williams, Richard Ben Cramer called her the love of Williams's life.[157] After Williams's death, her sons filed suit to recover her furniture from Williams's condominium, as well as a half-interest in the condominium they claimed he had given her.[158]

Williams had a strong respect for General Douglas MacArthur, referring to him as his "idol".[159] For Williams's 40th birthday, MacArthur sent him an oil painting of himself with the inscription "To Ted Williams, not only America's greatest baseball player, but a great American who served his country. Your friend, Douglas MacArthur. General U.S. Army."[160]

Politically, Williams was a Republican,[161] and was described by one biographer as "to the right of Attila the Hun" except when it came to civil rights.[162] Another writer similarly noted that while in the 1960s he had a liberal attitude on civil rights, he was pretty far right on other cultural issues of the time, calling him ultraconservative in the tradition of Barry Goldwater and John Wayne.[161]

Williams campaigned for Richard Nixon in the 1960 United States presidential election, and after Nixon lost to John F. Kennedy, refused several invitations from President Kennedy to get together at Cape Cod. He supported Nixon again in 1968, and as manager of the Senators kept a picture of him on his desk, meeting with the President several times while managing the team. In 1972, he called Nixon "the greatest president of my lifetime".[161] In the following years, Williams endorsed several other candidates in Republican Party presidential primaries, including George H. W. Bush in 1988 (for whom he also campaigned in New Hampshire),[163] Bob Dole in 1996, and George W. Bush in 2000.[164]

According to friends, Williams was an atheist,[165] and this influenced his decision to be cryogenically frozen. His daughter Claudia stated, "It was like a religion, something we could have faith in... no different from holding the belief that you might be reunited with your loved ones in heaven".[166]

Williams's brother Danny and his son John-Henry both died of leukemia.[167]

In his last years, Williams suffered from cardiomyopathy. He had a pacemaker implanted in November 2000 and he underwent open-heart surgery in January 2001. After suffering a series of strokes and congestive heart failure, he died of cardiac arrest at the age of 83 on July 5, 2002, at Citrus Memorial Hospital, Inverness, Florida, near his home in Citrus Hills, Florida.[168]

Though his will stated his desire to be cremated and his ashes scattered in the Florida Keys, Williams's son John-Henry and younger daughter Claudia chose to have his remains cryonically frozen.

Ted's elder daughter, Bobby-Jo Ferrell, brought a suit to have her father's wishes recognized. John-Henry's lawyer then produced an informal "family pact" signed by Ted, Claudia, and John-Henry, in which they agreed "to be put into biostasis after we die" to "be able to be together in the future, even if it is only a chance."[169] Bobby-Jo and her attorney, Spike Fitzpatrick (former attorney of Ted Williams), contended that the family pact, which was scribbled on an ink-stained napkin, was forged by John-Henry and/or Claudia.[170] Fitzpatrick and Ferrell believed that the signature was not obtained legally.[171] Laboratory analysis proved that the signature was genuine.[171] John-Henry said that his father was a believer in science and was willing to try cryonics if it held the possibility of reuniting the family.[172]

Though the family pact upset some friends, family and fans, a public plea for financial support of the lawsuit by Ferrell produced little result.[172] Citing financial difficulties, Ferrell dropped her lawsuit on the condition that a $645,000 trust fund left by Williams would immediately pay the sum out equally to the three children.[172] Inquiries to cryonics organizations increased after the publicity from the case.[170]

In Ted Williams: The Biography of an American Hero, author Leigh Montville claims that the signature on the family cryonics pact was a practice Ted Williams autograph on a plain piece of paper, around which the agreement had later been handwritten. The pact document was signed "Ted Williams", the same as his autographs, whereas, according to Montville, he would always sign his legal documents "Theodore Williams". However, Claudia testified to the authenticity of the document in an affidavit.[173]

Williams's body was subsequently decapitated as part of Alcor's neuropreservation option.[174] Following John-Henry's unexpected illness and death from acute myeloid leukemia on March 6, 2004, John-Henry's body was also transported to Alcor, in fulfillment of the family agreement.[175]

In 1954, Williams was inducted by the San Diego Hall of Champions into the Breitbard Hall of Fame honoring San Diego's finest athletes both on and off the playing surface.[176]

Williams was inducted into the Baseball Hall of Fame on July 25, 1966.[177] In his induction speech, Williams included a statement calling for the recognition of the great Negro leagues players: "I've been a very lucky guy to have worn a baseball uniform, and I hope some day the names of Satchel Paige and Josh Gibson in some way can be added as a symbol of the great Negro players who are not here only because they weren't given a chance."[178] Williams was referring to two of the most famous names in the Negro leagues, who were not given the opportunity to play in the Major Leagues before Jackie Robinson broke the color barrier in 1947. Gibson died early in 1947 and thus never played in the majors, and Paige's brief major league stint came long past his prime as a player. This powerful and unprecedented statement from the Hall of Fame podium was "a first crack in the door that ultimately would open and include Paige and Gibson and other Negro league stars in the shrine."[178] Paige was the first inducted, in 1971; Gibson and others followed, starting in 1972 and continuing on and off into the 21st century.

On November 18, 1991, President George H. W. Bush presented Williams with the Presidential Medal of Freedom, the highest civilian award in the US.[179]

Two roads were named in Williams's honor while he was still alive: the Ted Williams Tunnel in Boston, Massachusetts, which opened in December 1995 and carries 1.6 miles (2.6 km) of the final 2.3 miles (3.7 km) of Interstate 90 under Boston Harbor, and Ted Williams Parkway (California State Route 56) in San Diego County, California, which opened in 1992. In 2016, the major league San Diego Padres inducted Williams into their hall of fame for his contributions to baseball in San Diego.[180]

The Tampa Bay Rays' home field, Tropicana Field, houses the Ted Williams Museum (formerly in Hernando, Florida, 1994-2006) behind the left-field fence. From the Tampa Bay Rays website: "The Ted Williams Museum and Hitters Hall of Fame brings a special element to the Tropicana Field. Fans can view an array of different artifacts and pictures of the 'Greatest hitter that ever lived.' These memorable displays range from Ted Williams's days in the military through his professional playing career. This museum is dedicated to some of the greatest players to ever 'lace 'em up,' including Willie Mays, Joe DiMaggio, Mickey Mantle, Roger Maris."

In 2013, the Bob Feller Act of Valor Award honored Williams as one of 37 Baseball Hall of Fame members for his service in the United States Marine Corps during World War II.[181]

At the time of his retirement, Williams ranked third all-time in home runs (behind Babe Ruth and Jimmie Foxx), seventh in RBIs (after Ruth, Cap Anson, Lou Gehrig, Ty Cobb, Foxx, and Mel Ott), and seventh in batting average (behind Cobb, Rogers Hornsby, Shoeless Joe Jackson, Lefty O'Doul, Ed Delahanty and Tris Speaker). His career batting average of .3444 is the highest of any player who played his entire career in the live-ball era following 1920.
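For reference, batting average is simply hits divided by at-bats. Using Williams's commonly cited career totals (2,654 hits in 7,706 at-bats; these figures are not stated above and are included here only as a back-of-the-envelope check):

    \[ \mathrm{BA} = \frac{H}{AB} = \frac{2654}{7706} \approx .3444 \]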

Most modern statistical analyses place Williams, along with Ruth and Barry Bonds, among the three most potent hitters ever to have played the game. Williams's 1941 season is often compared favorably with the greatest seasons of Ruth and Bonds in terms of offensive statistical measures such as slugging, on-base percentage, and "offensive winning percentage." As a further indication, of the ten best seasons for OPS (on-base plus slugging), a popular modern measure of offensive productivity, four each were achieved by Ruth and Bonds, and two by Williams.
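For readers unfamiliar with the statistic, OPS is the sum of on-base percentage (OBP) and slugging percentage (SLG); these are the standard baseball definitions, not anything specific to this article:

    \[ \mathrm{OPS} = \mathrm{OBP} + \mathrm{SLG}, \qquad \mathrm{OBP} = \frac{H + BB + HBP}{AB + BB + HBP + SF}, \qquad \mathrm{SLG} = \frac{TB}{AB} \]

where H is hits, BB walks, HBP times hit by pitch, AB at-bats, SF sacrifice flies, and TB total bases.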

In 1999, Williams was ranked as number eight on The Sporting News' list of the 100 Greatest Baseball Players, where he was the highest-ranking left fielder.[182]

Read the original here:

Ted Williams - Wikipedia

Why the sci-fi dream of cryonics never died | MIT Technology Review

The environment was something of a shift for Drake, who had spent the previous seven years as the medical response director of the Alcor Life Extension Foundation. Though it was the longtime leader in cryonics, Alcor was still a small nonprofit. It had been freezing the bodies and brains of its members, with the idea of one day bringing them back to life, since 1976.

The foundation, and cryonics in general, had long survived outside of mainstream acceptance. Typically shunned by the scientific community, cryonics is best known for its appearance in sci-fi films like 2001: A Space Odyssey. But its adherents have held on to a dream that at some point in the future, advances in medicine will allow for resuscitation and additional years on Earth. Over decades, small, tantalizing developments in related technology, as well as high-profile frozen test subjects like Ted Williams, have kept the hope alive. Today, nearly 200 dead patients are frozen in Alcor's cryogenic chambers at temperatures of -196 °C, including a handful of celebrities, who have paid tens of thousands of dollars for the goal of possible revival and ultimately reintegration into society.

But it's the recent involvement of Yinfeng that signals something of a new era for cryonics. With impressive financial resources, government support, and scientific staff, it's one of a handful of new labs focused on expanding the consumer appeal of cryonics and trying anew to bring credibility to the long-disputed theory of human reanimation. Just a year after Drake came on board as research director of the Shandong Yinfeng Life Science Research Institute, the subsidiary of the Yinfeng Biological Group overseeing the cryonics program, the institute performed its first cryopreservation. Its storage vats now hold about a dozen clients who are paying upwards of $200,000 to preserve the whole body.

Still, the field remains rooted in faith rather than any real evidence that it works. "It's a hopeless aspiration that reveals an appalling ignorance of biology," says Clive Coen, a neuroscientist and professor at King's College London.

Even if one day you could perfectly thaw a frozen human body, you would still just have a warm dead body on your hands.

The cryonics process typically goes something like this: Upon a person's death, a response team begins cooling the corpse to a low temperature and performs cardiopulmonary support to sustain blood flow to the brain and organs. Then the body is moved to a cryonics facility, where an organ preservation solution is pumped through the veins before the body is submerged in liquid nitrogen. This process should commence within one hour of death; the longer the wait, the greater the damage to the body's cells. Then, once the frozen cadaver is ensconced in the cryogenic chamber, the hope of the dead begins.
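As a minimal illustration of the timing constraint described above (a sketch only, not any organization's actual procedure or software; the function name and threshold below simply encode the article's one-hour claim):

    from datetime import datetime, timedelta

    # The article states preservation should begin within about one hour of
    # death; beyond that, damage to the body's cells increases.
    MAX_DELAY = timedelta(hours=1)

    def within_window(time_of_death: datetime, procedure_start: datetime) -> bool:
        """Return True if preservation starts inside the one-hour window."""
        return timedelta(0) <= procedure_start - time_of_death <= MAX_DELAY

    # Hypothetical example: a team starting 45 minutes after legal death.
    death = datetime(2022, 10, 12, 14, 0)
    print(within_window(death, death + timedelta(minutes=45)))  # True
    print(within_window(death, death + timedelta(minutes=90)))  # False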

Since its beginnings in the late 1960s, the field has attracted opprobrium from the scientific community, particularly its more respectable cousin cryobiology, the study of how freezing and low temperatures affect living organisms and biological materials. The Society for Cryobiology even banned its members from involvement in cryonics in the 1980s, with a former society president lambasting the field as closer to fraud than either faith or science.

In recent years, though, it has grabbed the attention of the libertarian techno-optimist crowd, mostly tech moguls dreaming of their own immortality. And a number of new startups are expanding the playing field. Tomorrow Biostasis in Berlin became the first cryonics company in Western Europe in 2019, for example, and in early 2022, Southern Cryonics opened a facility in Australia.

"More researchers are open to longer-term, futuristic topics than there might have been 20 years ago or so," says Tomorrow Biostasis founder Emil Kendziorra.

Visit link:

Why the sci-fi dream of cryonics never died | MIT Technology Review

Arizona cryonics facility preserves bodies to revive later

SCOTTSDALE, Ariz., Oct 12 (Reuters) - Time and death are "on pause" for some people in Scottsdale, Arizona.

Inside tanks filled with liquid nitrogen are the bodies and heads of 199 humans who opted to be cryopreserved in hopes of being revived in the future when science has advanced beyond what it is capable of today. Many of the "patients," as Alcor Life Extension Foundation calls them, were terminally ill with cancer, ALS or other diseases with no present-day cure.

Matheryn Naovaratpong, a Thai girl with brain cancer, is the youngest person to be cryopreserved, at the age of 2 in 2015.

"Both her parents were doctors and she had multiple brain surgeries and nothing worked, unfortunately. So they contacted us," said Max More, chief executive of Alcor, a nonprofit which claims to be the world leader in cryonics.

Bitcoin pioneer Hal Finney, another Alcor patient, had his body cryopreserved after death from ALS in 2014.

The cryopreservation process begins after a person is declared legally dead. Blood and other fluids are removed from the patient's body and replaced with chemicals designed to prevent the formation of damaging ice crystals. Vitrified at extremely cold temperatures, Alcor patients are then placed in tanks at the Arizona facility "for as long as it takes for technology to catch up," More said.

The minimum cost is $200,000 for a body and $80,000 for the brain alone. Most of Alcor's almost 1,400 living "members" pay by making the company the beneficiary of life insurance policies equal to the cost, More said.

More's wife Natasha Vita-More likens the process to taking a trip to the future.

"The disease or injury cured or fixed, and the person has a new body cloned or a whole body prosthetic or their body reanimated and (can) meet up with their friends again," she said.

Many medical professionals disagree, said Arthur Caplan, who heads the medical ethics division at New York University's Grossman School of Medicine.

"This notion of freezing ourselves into the future is pretty science fiction and it's naive," he said. "The only group... getting excited about the possibility are people who specialize in studying the distant future or people who have a stake in wanting you to pay the money to do it."

Reporting by Liliana Salgado; Editing by Richard Chang


See more here:

Arizona cryonics facility preserves bodies to revive later

James Webb Discovery Helps Date Birth of Very First Galaxies – TIME

  1. James Webb Discovery Helps Date Birth of Very First Galaxies  TIME
  2. Discovering rare red spiral galaxy population from early universe with the James Webb Space Telescope  Science Daily
  3. NASA's James Webb Space Telescope discovers furthest galaxy  Cosmos
  4. Oldest known galaxies spotted by James Webb Space Telescope  Fox News
  5. James Webb Space Telescope 'fingerprints' earliest galaxies  BBC

Originally posted here:

James Webb Discovery Helps Date Birth of Very First Galaxies - TIME

Roger de Montgomery – Wikipedia

11th-century Norman nobleman and earl in England

Roger de Montgomery (died 1094), also known as Roger the Great, was the first Earl of Shrewsbury and Earl of Arundel, in Sussex. His father, also Roger de Montgomery, seigneur of Montgomery, was a member of the House of Montgomerie and was probably a grandnephew of the Duchess Gunnor, wife of Duke Richard I of Normandy, the great-grandfather of William the Conqueror. The elder Roger had large landholdings in central Normandy, chiefly in the valley of the River Dives, which the younger Roger inherited.

Roger inherited his father's estates in 1055. By the time of the Council of Lillebonne, which took place in about January 1066, he was one of William the Conqueror's principal counsellors, playing a major role at the Council. He may not have fought in the initial invasion of England in 1066, instead staying behind to help govern Normandy. According to Wace's Roman de Rou, however, he commanded the Norman right flank at Hastings, returning to Normandy with King William in 1067.[1] Afterwards, he was entrusted with land in two regions critical for the defence of the Kingdom of England. At the end of 1067 or early in 1068, William gave Roger nearly all of what is now the county of West Sussex, a total of 83 manors,[2] which at the time of the Domesday Survey (1086) was an area known as the Rape of Arundel; and about 1071 Roger was granted estates in Shropshire[3] which amounted to some seven-eighths of the whole county.[2] He was also made Earl of Shrewsbury, but it is uncertain whether the earldom came to him at the same time as the land; it may have been a few years later. In 1083, Roger founded Shrewsbury Abbey.[3]

Roger was thus one of the half dozen greatest magnates in England during William the Conqueror's reign.[4] The Rape of Arundel was eventually split into two "rapes", one keeping the name of Arundel, the other being called the Rape of Chichester.[4]

Besides his estates in Sussex and Shropshire, Roger had others in Surrey (four manors), Hampshire (nine manors), Wiltshire (three manors), Middlesex (eight manors), Gloucestershire (one manor), Worcestershire (two manors), Cambridgeshire (eight manors), Warwickshire (eleven manors), and Staffordshire (thirty manors).[2] The income from Roger's estates amounted to about £2,000 per year, while in 1086 the income of all the land in England was around £72,000. The £2,000 (equivalent to several million pounds in 2022) was almost 3 per cent of the nation's GDP.[5][6]
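The stated share follows directly from the two figures above:

    \[ \frac{\pounds\,2000}{\pounds\,72000} \approx 0.028 \approx 3\% \]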

After William I's death in 1087, Roger joined with other rebels to overthrow the newly crowned king, William II, in the Rebellion of 1088. However, William was able to convince Roger to abandon the rebellion and side with him. This worked out favourably for Roger, as the rebels were beaten and lost their land holdings in England.[7]

Roger married Mabel de Bellême, who was heiress to a large territory straddling the border between Normandy and Maine. The medieval chronicler Orderic Vitalis paints a picture of Mabel of Bellême as a scheming and cruel woman.[8] She was murdered by Hugh Bunel and his brothers who, possibly in December 1077, rode into her castle of Bures-sur-Dive and cut off her head as she lay in bed.[8][9] Their motive for the murder was that Mabel had deprived them of their paternal inheritance.[10] Roger and Mabel had 10 children.

Roger then married Adelaide du Puiset, by whom he had one son, Everard, who entered the Church.

After his death, Roger's estates were divided.[19] His eldest surviving son, Robert of Bellême, received the bulk of the Norman estates (as well as his mother's estates); the next son, Hugh, received the bulk of the English estates and the Earldom of Shrewsbury.[19] After Hugh's death, the elder son Robert inherited the earldom.[19]

Read the rest here:

Roger de Montgomery - Wikipedia

Video games can never be art | Roger Ebert | Roger Ebert

What stirs me to return to the subject? I was urged by a reader, Mark Johns, to consider a video of a TED talk given at USC by Kellee Santiago, a designer and producer of video games. I did so. I warmed to Santiago immediately. She is bright, confident, persuasive. But she is mistaken.

I propose to take an unfair advantage. She spoke extemporaneously. I have the luxury of responding after consideration. If you want to follow along, I urge you to watch her talk, which is embedded below. It's only 15 minutes long, and she makes the time pass quickly.

She begins by saying video games "already ARE art." Yet she concedes that I was correct when I wrote, "No one in or out of the field has ever been able to cite a game worthy of comparison with the great poets, filmmakers, novelists and poets." To which I could have added painters, composers, and so on, but my point is clear.

Then she shows a slide of a prehistoric cave painting, calling it "kind of chicken scratches on walls," and contrasts it with Michelangelo's ceiling of the Sistine Chapel. Her point is that while video games may be closer to the chicken scratch end of the spectrum, I am foolish to assume they will not evolve.

She then says speech began as a form of warning, and writing as a form of bookkeeping, but they evolved into storytelling and song. Actually, speech probably evolved into a form of storytelling and song long before writing was developed. And cave paintings were a form of storytelling, perhaps of religion, and certainly of the creation of beauty from those chicken-scratches Werner Herzog is even now filming in 3-D.

Herzog believes, in fact, that the paintings on the wall of the Cave of Chauvet-Pont-d'Arc in Southern France should only be looked at in the context of the shadows cast on those dark walls by the fires built behind the artists, which suggests the cave paintings, their materials of charcoal and ochre and all that went into them were the fruition of a long gestation, not the beginning of something--and that the artists were enormously gifted. They were great artists at that time, geniuses with nothing to build on, and were not in the process of becoming Michelangelo or anyone else. Any gifted artist will tell you how much he admires the "line" of those prehistoric drawers in the dark, and with what economy and wit they evoked the animals they lived among.

Read the rest here:

Video games can never be art | Roger Ebert | Roger Ebert

Life extension – Wikipedia

Concept of extending human lifespan by improvements in medicine or biotechnology

Life extension is the concept of extending the human lifespan, either modestly through improvements in medicine or dramatically by increasing the maximum lifespan beyond its generally settled limit of 125 years.[1]

Several researchers in the area, along with "life extensionists", "immortalists" or "longevists" (those who wish to achieve longer lives themselves), postulate that future breakthroughs in tissue rejuvenation, stem cells, regenerative medicine, molecular repair, gene therapy, pharmaceuticals and organ replacement (such as with artificial organs or xenotransplantations) will eventually enable humans to have indefinite lifespans (agerasia[2]) through complete rejuvenation to a healthy youthful condition. The ethical ramifications, if life extension becomes a possibility, are debated by bioethicists.

The sale of purported anti-aging products such as supplements and hormone replacement is a lucrative global industry. For example, the industry that promotes the use of hormones as a treatment for consumers to slow or reverse the aging process generated about $50 billion of revenue a year in the US market in 2009.[3] The use of such hormone products, however, has not been proven to be effective or safe.[3][4][5][6]

During the process of aging, an organism accumulates damage to its macromolecules, cells, tissues, and organs. Specifically, aging is characterized as and thought to be caused by "genomic instability, telomere attrition, epigenetic alterations, loss of proteostasis, deregulated nutrient sensing, mitochondrial dysfunction, cellular senescence, stem cell exhaustion, and altered intercellular communication."[7] Oxidation damage to cellular contents caused by free radicals is believed to contribute to aging as well.[8][9]

The longest documented human lifespan is 122 years 164 days, the case of Jeanne Calment who according to records was born in 1875 and died in 1997, whereas the maximum lifespan of a wildtype mouse, commonly used as a model in research on aging, is about three years.[10] Genetic differences between humans and mice that may account for these different aging rates include differences in efficiency of DNA repair, antioxidant defenses, energy metabolism, proteostasis maintenance, and recycling mechanisms such as autophagy.[11]

The average lifespan in a population is lowered by infant and child mortality, which are frequently linked to infectious diseases or nutrition problems. Later in life, vulnerability to accidents and age-related chronic disease such as cancer or cardiovascular disease play an increasing role in mortality. Extension of expected lifespan can often be achieved by access to improved medical care, vaccinations, good diet, exercise and avoidance of hazards such as smoking.

Maximum lifespan is determined by the rate of aging for a species inherent in its genes and by environmental factors. Widely recognized methods of extending maximum lifespan in model organisms such as nematodes, fruit flies, and mice include caloric restriction, gene manipulation, and administration of pharmaceuticals.[12] Another technique uses evolutionary pressures such as breeding from only older members or altering levels of extrinsic mortality.[13][14] Some animals such as hydra, planarian flatworms, and certain sponges, corals, and jellyfish do not die of old age and exhibit potential immortality.[15][16][17][18]

Senolytics eliminate senescent cells, whereas senomorphics, with candidates such as apigenin, everolimus, and rapamycin, modulate properties of senescent cells without eliminating them, suppressing phenotypes of senescence, including the SASP.[22][23] Senomorphic effects may be one major effect mechanism of a range of prolongevity drug candidates, though such candidates are typically studied for multiple mechanisms rather than just one. There are biological databases of prolongevity drug candidates under research, as well as of potential gene/protein targets. These are enhanced by longitudinal cohort studies, electronic health records, computational (drug) screening methods, computational biomarker-discovery methods, and computational biodata-interpretation/personalized medicine methods.[24][25][26]

Such strategies, along with testing in model organisms and xenografts, may help address the difficulties of trials with humans, who have relatively long lifespans compared to other animals, and the correspondingly larger need to protect human health from early-stage interventions in clinical trials.

Besides rapamycin and senolytics, the drug-repurposing candidates studied most extensively include metformin, acarbose, spermidine (see below) and NAD+ enhancers.[27]

Many prolongevity drugs are synthetic alternatives or potential complements to existing nutraceuticals, such as various sirtuin-activating compounds under investigation like SRT2104.[28] In some cases pharmaceutical administration is combined with that of nutraceuticals, as with glycine combined with NAC.[29] Studies are often structured around specific prolongevity targets, listing both nutraceuticals and pharmaceuticals (together or separately), such as FOXO3 activators.[30]

Researchers are also exploring ways to mitigate side-effects from such substances (possibly most notably rapamycin and its derivatives) such as via protocols of intermittent administration[31][23][22][32][33] and have called for research that helps determine optimal treatment schedules (including timing) in general.[34]

The free-radical theory of aging suggests that antioxidant supplements might extend human life. Reviews, however, have found that use of vitamin A (as β-carotene) and vitamin E supplements can possibly increase mortality.[35][36] Other reviews have found no relationship between vitamin E and other vitamins and mortality.[37] Vitamin D supplementation at various dosages is being investigated in trials,[38] and there is also research into GlyNAC (see above).[29]

Complications of antioxidant supplementation (especially continuous high dosages far above the RDA) include the fact that reactive oxygen species (ROS), which are mitigated by antioxidants, "have been found to be physiologically vital for signal transduction, gene regulation, and redox regulation, among others, implying that their complete elimination would be harmful". In particular, one of multiple ways they can be detrimental is by inhibiting adaptation to exercise such as muscle hypertrophy (e.g. during dedicated periods of caloric surplus).[39][40][41] There is also research into stimulating, activating, or fueling endogenous antioxidant generation, in particular of the nutraceutical glycine and the pharmaceutical NAC.[42] Antioxidants can change the oxidation status of different tissues, targets, or sites, each with potentially different implications, especially at different concentrations.[43][44][45] A review suggests mitochondria have a hormetic response to ROS, whereby low oxidative damage can be beneficial.[46]

In some studies calorie restriction has been shown to extend the life of mice, yeast, and rhesus monkeys.[47][48] However, a more recent study did not find calorie restriction to improve survival in rhesus monkeys.[49] In humans the long-term health effects of moderate caloric restriction with sufficient nutrients are unknown.[50]

According to two scientific reviews published in 2021, accumulating data suggests dietary restriction (DR) mainly intermittent fasting and caloric restriction results in many of the same beneficial changes in adult humans as in studied organisms, potentially increasing health- and lifespan beyond[51] the benefits of healthy body weight.[51][52][53][54][55][56]

Which protocols of, and combinations (e.g. see caloric restriction mimetic and AMPK) with, DR are effective or most effective in humans is largely unknown and is being actively researched. A geroscience field of "precision nutrigeroscience" has been proposed that also considers the potential need to adjust nutritional interventions per individual (e.g. due to differences in genetics and age).[53] Intermittent fasting refers to periods during which no food, but only water and tea/coffee (the latter reduces appetite or facilitates caloric restriction and also activates autophagy),[57][58][59][60] is ingested, such as daily time-restricted eating with a window of 8 to 12 hours for any caloric intake. It can be combined, for synergistic effects, with overall caloric restriction and variants of the Mediterranean diet, which usually has benefits for long-term cardiovascular health and longevity.[61] A rough sketch of such a daily eating window appears below.
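As an illustration of the time-restricted-eating pattern just described (the window boundaries and meal times are hypothetical examples chosen for the sketch, not a recommendation from the cited sources):

    from datetime import time

    # Hypothetical 8-hour eating window (within the 8-12 hour range above).
    WINDOW_START = time(10, 0)  # first caloric intake allowed
    WINDOW_END = time(18, 0)    # last caloric intake allowed

    def meal_allowed(meal_time: time) -> bool:
        """True if a meal at meal_time falls inside the eating window."""
        return WINDOW_START <= meal_time <= WINDOW_END

    meals = [time(9, 30), time(12, 0), time(17, 45), time(20, 15)]
    print([meal_allowed(m) for m in meals])  # [False, True, True, False]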

CALERIE is a trial of prolonged continuous calorie restriction on healthy humans.[62]

Specific amino acids in dietary protein are associated with the regulation of lifespan/ageing, and their targeted restriction has been proposed for further research.[63][64][65][66]

Mechanistically, research to date has identified various nutrient sensors involved in the beneficial effects of caloric restriction as well as methionine-reduction/restriction including notably AMPK[67] (see also: mTOR inhibitors), mTOR, insulin-related pathways, sirtuins,[68][69] NAD+,[70] NFkB, and FOXO and, partly by extension, processes such as DNA repair[71][68] and autophagy.[72][73][74][51][34][71]

During periods of caloric restriction, higher protein intakes "may be required to maximize muscle retention in lean, resistance-trained subjects"[75] and "resistance training (RT) can attenuate muscle loss during caloric restriction",[76] with strength training also generally being associated with a "10-17% lower risk of all-cause mortality, cardiovascular disease (CVD), total cancer, diabetes and lung cancer".[77] Reviews have clarified that permanent or periodic caloric restriction is conducted in such a way that no malnutrition occurs (see below).[78][54][52]

Research suggests that increasing adherence to Mediterranean diet patterns is associated with a reduction in total and cause-specific mortality, extending health- and lifespan.[79][80][74][81] Research is identifying the key beneficial components of the Mediterranean diet.[82][83] It shares various characteristics with the similarly beneficial Okinawa diet.[84] Potential anti-aging mechanisms of various nutrients are not yet understood.[85] Shares of macronutrients[86][52] and level of caloric intake may also be of significance, including in periods when no dietary restriction occurs[86] such as not having a fat-intake that is too low[52] and not having a prolonged caloric surplus or caloric deficit that is too large.

Studies suggest dietary changes are a major cause of relative national rises in lifespan.[87]

Mechanistically, research suggests that the gut microbiome, which varies per person and changes throughout lifespan, is also involved in the beneficial effects, due to which various diet supplementations with prebiotics, various diverse (multi-strain) probiotics and synbiotics, and fecal microbiota transplantation are being investigated for life extension,[88][26][89] mainly for prolonging healthspan,[90][91][92] with many important questions being unresolved.[93]

Several approaches are being pursued to develop optimal diets for health- and lifespan (or "longevity diets").[52]

Beyond research into senolytics, (synthetic) prolongevity drugs, vitamins and antioxidants, and prebiotics and probiotics, there are nutraceuticals (dietary supplements and bioactive plant compounds (phytochemicals), but not pharmaceuticals)[120] that are being investigated in the life sciences, nutrition science, and gerontology for potential health- and lifespan extension in healthy humans. Sometimes their use is researched or recommended as a way to correct nutritional deficiencies from switching to otherwise healthy foods, in particular from replacing meat consumption with a higher intake of plant-based foods.[97][121] Especially, but not only, in such cases the supplementation of minerals and various specific micronutrients is investigated. Correcting magnesium deficiency, for instance, could prolong life.[122] Many supplements are researched primarily for potential improvements in health and healthspan rather than for extending lifespan.

Some studies hypothesize that relative health and longevity benefits of various foods and diets can be largely or to a large part attributed to the nutraceuticals they contain.[83][123][104][124] Some studies suggest increasing the intake of specific foods (see above) based on such results, while some investigate supplementation, including of dosages that are impractical to achieve with whole foods.

Researched substances include various polyphenols such as pterostilbene[125][126][124][69] or flavonoids, notably epicatechin.[85] Some herbal extracts, like Rhodiola rosea, are also being investigated on the basis of results of tests with model organisms.[127][128] Some of these are AMPK activators and hence caloric restriction mimetics (some possibly exercise mimetics[129] as well). AMPK activators include resveratrol[130][125][131][104][123][132] and berberine.[133][134][132] Many such nutraceuticals are also potent antioxidants.[123][135] Like prolongevity drugs and bioactive compounds in general, they can have multiple potential effect mechanisms; the polyphenol resveratrol, for instance, also activates possibly pro-longevity sirtuin activity.[68][123]

A common issue with many already-existing natural nutraceuticals like resveratrol is their low bioavailability.[136][137] Their side-effects are often low compared to several major longevity drug candidates. On the other hand, they are considered to often have "intrinsic natural bio-compatibility and safety".[69] Some of the compounds can have a "biphasic dose response" (a trait/effect of hormesis) whereby they (can) have beneficial effects at low or moderate doses and toxic effects at high doses.[34]

Further advanced biosciences-based approaches are also under investigation.

There is a need and research into the development of aging biomarkers such as the epigenetic clock "to assess the ageing process and the efficacy of interventions to bypass the need for large-scale longitudinal studies".[62][25] Such biomarkers may also include in vivo brain imaging.[152]

Reviews sometimes include structured tables that provide systematic overviews of intervention/drug candidates with a review calling for integrating "current knowledge with multi-omics, health records, and drug safety data to predict drugs that can improve health in late life" and listing major outstanding questions.[24] Biological databases of prolongevity drug candidates under research as well as of potential gene/protein targets include GenAge, DrugAge and Geroprotectors.[24][153]

A review has pointed out that the approach of "'epidemiological' comparison of how a low versus a high consumption of an isolated macronutrient and its association with health and mortality may not only fail to identify protective or detrimental nutrition patterns but may lead to misleading interpretations". It proposes a multi-pillar approach, and summarizes findings towards constructing multi-system-considering and at least age-personalized dynamic refined longevity diets. Epidemiological-type observational studies included in meta-analyses should according to the study at least be complemented by "(1) basic research focused on lifespan and healthspan, (2) carefully controlled clinical trials, and (3) studies of individuals and populations with record longevity".[52]

The anti-aging industry offers several hormone therapies. Some of these have been criticized for possible dangers and a lack of proven effect. For example, the American Medical Association has been critical of some anti-aging hormone therapies.[3]

While growth hormone (GH) decreases with age, the evidence for use of growth hormone as an anti-aging therapy is mixed and based mostly on animal studies. There are mixed reports that GH or IGF-1 modulates the aging process in humans and about whether the direction of its effect is positive or negative.[154]

Klotho[142][155] and exerkines[145] (see above) like irisin[156] are being investigated for potential pro-longevity therapies.

Loneliness/isolation, social life and support,[81][157] exercise/physical activity (partly via neurobiological effects and increased NAD+ levels),[81][158][62][66][159][160] psychological characteristics/personality (possibly highly indirectly),[161][162] sleep duration,[81] circadian rhythms (patterns of sleep, drug-administration and feeding),[163][164][165] type of leisure activities,[81] not smoking,[81] altruistic emotions and behaviors,[166][167] subjective well-being,[168] mood[81] and stress (including via heat shock protein)[81][169] are investigated as potential (modulatable) factors of life extension.

Healthy lifestyle practices and healthy diet have been suggested as "first-line function-preserving strategies, with pharmacological agents, including existing and new pharmaceuticals and novel 'nutraceutical' compounds, serving as potential complementary approaches".[170]

Collectively, addressing common causes of death could extend lifespans of populations and humanity overall. For instance, a 2020 study indicates that the global mean loss of life expectancy (LLE) from air pollution in 2015 was 2.9 years, substantially more than, for example, 0.3 years from all forms of direct violence, albeit a significant fraction of the LLE (a measure similar to years of potential life lost) is considered to be unavoidable.[172]

Regular screening and doctor visits have been suggested as a lifestyle-societal intervention.[81] (See also: medical test and biomarker)

Health policy and changes to standard healthcare could support the adoption of the field's conclusions; a review suggests that the longevity diet would be a "valuable complement to standard healthcare and that, taken as a preventative measure, it could aid in avoiding morbidity, sustaining health into advanced age" as a form of preventive healthcare.[52]

It has been suggested that in terms of healthy diets, Mediterranean-style diets could be promoted by countries for ensuring healthy-by-default choices ("to ensure the healthiest choice is the easiest choice") and with highly effective measures including dietary education, food checklists and recipes that are "simple, palatable, and affordable".[173]

A review suggests that "targeting the aging process per se may be a far more effective approach to prevent or delay aging-associated pathologies than treatments specifically targeted to particular clinical conditions".[174]

Low ambient temperature, as a physical factor affecting free radical levels, was identified as a treatment producing exceptional lifespan increase in Drosophila melanogaster and other living beings.[175]

The extension of life has been a desire of humanity and a mainstay motif in the history of scientific pursuits and ideas, from the Sumerian Epic of Gilgamesh and the Egyptian Smith medical papyrus, all the way through the Taoists, Ayurveda practitioners, alchemists, hygienists such as Luigi Cornaro, Johann Cohausen and Christoph Wilhelm Hufeland, and philosophers such as Francis Bacon, René Descartes, Benjamin Franklin and Nicolas Condorcet. However, the beginning of the modern period in this endeavor can be traced to the end of the 19th and beginning of the 20th century, the so-called "fin-de-siècle" (end of the century) period, denoted as an "end of an epoch" and characterized by the rise of scientific optimism and therapeutic activism, entailing the pursuit of life extension (or life-extensionism). Among the foremost researchers of life extension in this period were the Nobel Prize-winning biologist Élie Metchnikoff (1845-1916), author of the cell theory of immunity and vice director of the Institut Pasteur in Paris, and Charles-Édouard Brown-Séquard (1817-1894), president of the French Biological Society and one of the founders of modern endocrinology.[176]

Sociologist James Hughes claims that science has been tied to a cultural narrative of conquering death since the Age of Enlightenment. He cites Francis Bacon (1561-1626) as an advocate of using science and reason to extend human life, noting Bacon's novel New Atlantis, wherein scientists worked toward delaying aging and prolonging life. Robert Boyle (1627-1691), founding member of the Royal Society, also hoped that science would make substantial progress with life extension, according to Hughes, and proposed such experiments as "to replace the blood of the old with the blood of the young". Biologist Alexis Carrel (1873-1944) was inspired by a belief in indefinite human lifespan that he developed after experimenting with cells, says Hughes.[177]

Regulatory and legal struggles between the Food and Drug Administration (FDA) and the Life Extension organization included seizure of merchandise and court action.[178] In 1991, Saul Kent and Bill Faloon, the principals of the organization, were jailed for four hours and were released on $850,000 bond each.[179] After 11 years of legal battles, Kent and Faloon convinced the US Attorney's Office to dismiss all criminal indictments brought against them by the FDA.[180]

In 2003, Doubleday published "The Immortal Cell: One Scientist's Quest to Solve the Mystery of Human Aging," by Michael D. West. West emphasised the potential role of embryonic stem cells in life extension.[181]

Other modern life extensionists include writer Gennady Stolyarov, who insists that death is "the enemy of us all, to be fought with medicine, science, and technology";[182] transhumanist philosopher Zoltan Istvan, who proposes that the "transhumanist must safeguard one's own existence above all else";[183] futurist George Dvorsky, who considers aging to be a problem that desperately needs to be solved;[184] and recording artist Steve Aoki, who has been called "one of the most prolific campaigners for life extension".[185]

In 1991, the American Academy of Anti-Aging Medicine (A4M) was formed. The American Board of Medical Specialties recognizes neither anti-aging medicine nor the A4M's professional standing.[186]

In 2003, Aubrey de Grey and David Gobel formed the Methuselah Foundation, which gives financial grants to anti-aging research projects. In 2009, de Grey and several others founded the SENS Research Foundation, a California-based scientific research organization which conducts research into aging and funds other anti-aging research projects at various universities.[187] In 2013, Google announced Calico, a new company based in San Francisco that will harness new technologies to increase scientific understanding of the biology of aging.[188] It is led by Arthur D. Levinson,[189] and its research team includes scientists such as Hal V. Barron, David Botstein, and Cynthia Kenyon. In 2014, biologist Craig Venter founded Human Longevity Inc., a company dedicated to scientific research to end aging through genomics and cell therapy. They received funding with the goal of compiling a comprehensive human genotype, microbiome, and phenotype database.[190]

Aside from private initiatives, aging research is being conducted in university laboratories, and includes universities such as Harvard and UCLA. University researchers have made a number of breakthroughs in extending the lives of mice and insects by reversing certain aspects of aging.[191][192][193][194]

Some critics dispute the portrayal of aging as a disease. For example, Leonard Hayflick, who determined that fibroblasts are limited to around 50 cell divisions, reasons that aging is an unavoidable consequence of entropy. Hayflick and fellow biogerontologists Jay Olshansky and Bruce Carnes have strongly criticized the anti-aging industry in response to what they see as unscrupulous profiteering from the sale of unproven anti-aging supplements.[5]

Research by Sobh and Martin (2011) suggests that people buy anti-aging products to obtain a hoped-for self (e.g., keeping youthful skin) or to avoid a feared self (e.g., looking old). The research shows that when consumers pursue a hoped-for self, it is expectations of success that most strongly drive their motivation to use the product. It also shows that when consumers seek to avoid a feared self, perceived failure of the product is more motivating than success.[195]

Though many scientists state[196] that life extension and radical life extension are possible, there are still no international or national programs focused on radical life extension. There are political forces both for and against life extension. By 2012, Longevity political parties had started in Russia, the United States, Israel, and the Netherlands. They aimed to provide political support to radical life extension research and technologies, to ensure the fastest possible and at the same time soft transition of society to the next step (life without aging and with radical life extension), and to provide access to such technologies to most currently living people.[197]

Some tech innovators and Silicon Valley entrepreneurs have invested heavily into anti-aging research. This includes Jeff Bezos (founder of Amazon), Larry Ellison (founder of Oracle), Peter Thiel (former PayPal CEO),[198] Larry Page (co-founder of Google), and Peter Diamandis.[199]

Leon Kass (chairman of the US President's Council on Bioethics from 2001 to 2005) has questioned whether potential exacerbation of overpopulation problems would make life extension unethical.[200] He states his opposition to life extension with the words:

"simply to covet a prolonged life span for ourselves is both a sign and a cause of our failure to open ourselves to procreation and to any higher purpose ... [The] desire to prolong youthfulness is not only a childish desire to eat one's life and keep it; it is also an expression of a childish and narcissistic wish incompatible with devotion to posterity."[201]

John Harris, former editor-in-chief of the Journal of Medical Ethics, argues that as long as life is worth living, according to the person himself, we have a powerful moral imperative to save the life and thus to develop and offer life extension therapies to those who want them.[202]

Transhumanist philosopher Nick Bostrom has argued that any technological advances in life extension must be equitably distributed and not restricted to a privileged few.[203] In an extended metaphor entitled "The Fable of the Dragon-Tyrant", Bostrom envisions death as a monstrous dragon who demands human sacrifices. In the fable, after a lengthy debate between those who believe the dragon is a fact of life and those who believe the dragon can and should be destroyed, the dragon is finally killed. Bostrom argues that political inaction allowed many preventable human deaths to occur.[204]

Controversy about life extension is due to fear of overpopulation and possible effects on society.[205] Biogerontologist Aubrey De Grey counters the overpopulation critique by pointing out that the therapy could postpone or eliminate menopause, allowing women to space out their pregnancies over more years and thus decreasing the yearly population growth rate.[206] Moreover, the philosopher and futurist Max More argues that, given the fact the worldwide population growth rate is slowing down and is projected to eventually stabilize and begin falling, superlongevity would be unlikely to contribute to overpopulation.[205]

A Spring 2013 Pew Research poll in the United States found that 38% of Americans would want life extension treatments, while 56% would reject them. However, it also found that 68% believed most people would want such treatments and that only 4% consider an "ideal lifespan" to be more than 120 years. The median "ideal lifespan" was 91 years of age, and the majority of the public (63%) viewed medical advances aimed at prolonging life as generally good. 41% of Americans believed that radical life extension (RLE) would be good for society, while 51% said they believed it would be bad for society.[207] One possibility for why 56% of Americans claim they would reject life extension treatments may be the cultural perception that living longer would result in a longer period of decrepitude, and that the elderly in our current society are unhealthy.[208]

Religious people are no more likely to oppose life extension than the unaffiliated,[207] though some variation exists between religious denominations.

Mainstream medical organizations and practitioners do not consider aging to be a disease. Biologist David Sinclair says: "I don't see aging as a disease, but as a collection of quite predictable diseases caused by the deterioration of the body".[209] The two main arguments used are that aging is both inevitable and universal while diseases are not.[210] However, not everyone agrees. Harry R. Moody, director of academic affairs for AARP, notes that what is normal and what is disease strongly depend on historical context.[211] David Gems, assistant director of the Institute of Healthy Ageing, argues that aging should be viewed as a disease.[212] In response to the universality of aging, Gems notes that the argument is as misleading as arguing that Basenji are not dogs because they do not bark.[213] Because of the universality of aging he calls it a "special sort of disease". Robert M. Perlman coined the terms "aging syndrome" and "disease complex" in 1954 to describe aging.[214]

The discussion of whether aging should be viewed as a disease or not has important implications. One view is that this would stimulate pharmaceutical companies to develop life extension therapies, and that in the United States it would also increase the regulation of the anti-aging market by the Food and Drug Administration (FDA). Anti-aging now falls under the regulations for cosmetic medicine, which are less tight than those for drugs.[213][215]

Theoretically, extension of maximum lifespan in humans could be achieved by reducing the rate of aging damage by periodic replacement of damaged tissues, molecular repair or rejuvenation of deteriorated cells and tissues, reversal of harmful epigenetic changes, or the enhancement of enzyme telomerase activity.[216][217]

Research geared towards life extension strategies in various organisms is currently under way at a number of academic and private institutions. Since 2009, investigators have found ways to increase the lifespan of nematode worms and yeast by 10-fold; the record in nematodes was achieved through genetic engineering and the extension in yeast by a combination of genetic engineering and caloric restriction.[218] A 2009 review of longevity research noted: "Extrapolation from worms to mammals is risky at best, and it cannot be assumed that interventions will result in comparable life extension factors. Longevity gains from dietary restriction, or from mutations studied previously, yield smaller benefits to Drosophila than to nematodes, and smaller still to mammals. This is not unexpected, since mammals have evolved to live many times the worm's lifespan, and humans live nearly twice as long as the next longest-lived primate. From an evolutionary perspective, mammals and their ancestors have already undergone several hundred million years of natural selection favoring traits that could directly or indirectly favor increased longevity, and may thus have already settled on gene sequences that promote lifespan. Moreover, the very notion of a "life-extension factor" that could apply across taxa presumes a linear response rarely seen in biology."[218]

There are a number of chemicals intended to slow the aging process currently being studied in animal models.[219] One type of research is related to the observed effects of a calorie restriction (CR) diet, which has been shown to extend lifespan in some animals.[220] Based on that research, there have been attempts to develop drugs that will have the same effect on the aging process as a caloric restriction diet, which are known as caloric restriction mimetic drugs. Some drugs that are already approved for other uses have been studied for possible longevity effects on laboratory animals because of a possible CR-mimic effect; they include rapamycin for mTOR inhibition[221] and metformin for AMPK activation.[222]

Sirtuin-activating polyphenols, such as resveratrol and pterostilbene,[223][224][225] and flavonoids, such as quercetin and fisetin,[226] as well as oleic acid,[227] are dietary supplements that have also been studied in this context. Other popular supplements with less clear biological pathways to target aging include lipoic acid,[228] senolytics such as curcumin,[226] and Coenzyme Q10.[229] Daily low doses of ethanol have also been studied as a potential supplement, in spite of its highly negative hormesis response at higher doses.[230]

Other attempts to create anti-aging drugs have taken different research paths. One notable direction of research has been research into the possibility of using the enzyme telomerase in order to counter the process of telomere shortening.[231] However, there are potential dangers in this, since some research has also linked telomerase to cancer and to tumor growth and formation.[232]

Future advances in nanomedicine could give rise to life extension through the repair of many processes thought to be responsible for aging. K. Eric Drexler, one of the founders of nanotechnology, postulated cell repair machines, including ones operating within cells and utilizing as yet hypothetical molecular computers, in his 1986 book Engines of Creation. Raymond Kurzweil, a futurist and transhumanist, stated in his book The Singularity Is Near that he believes that advanced medical nanorobotics could completely remedy the effects of aging by 2030.[233] According to Richard Feynman, it was his former graduate student and collaborator Albert Hibbs who originally suggested to him (circa 1959) the idea of a medical use for Feynman's theoretical nanomachines (see biological machine). Hibbs suggested that certain repair machines might one day be reduced in size to the point that it would, in theory, be possible to (as Feynman put it) "swallow the doctor". The idea was incorporated into Feynman's 1959 essay There's Plenty of Room at the Bottom.[234]

Some life extensionists suggest that therapeutic cloning and stem cell research could one day provide a way to generate cells, body parts, or even entire bodies (generally referred to as reproductive cloning) that would be genetically identical to a prospective patient. Recently, the US Department of Defense initiated a program to research the possibility of growing human body parts on mice.[235] Complex biological structures, such as mammalian joints and limbs, have not yet been replicated. Dog and primate brain transplantation experiments were conducted in the mid-20th century but failed due to rejection and the inability to restore nerve connections. As of 2006, the implantation of bio-engineered bladders grown from patients' own cells has proven to be a viable treatment for bladder disease.[236] Proponents of body part replacement and cloning contend that the required biotechnologies are likely to appear earlier than other life-extension technologies.

The use of human stem cells, particularly embryonic stem cells, is controversial. Opponents' objections generally are based on interpretations of religious teachings or ethical considerations. Proponents of stem cell research point out that cells are routinely formed and destroyed in a variety of contexts. Use of stem cells taken from the umbilical cord or parts of the adult body may not provoke controversy.[237]

The controversies over cloning are similar, except general public opinion in most countries stands in opposition to reproductive cloning. Some proponents of therapeutic cloning predict the production of whole bodies, lacking consciousness, for eventual brain transplantation.

Replacement of biological organs (which are susceptible to disease) with mechanical ones could extend life. This is the goal of the 2045 Initiative.[238]

Cryonics is the low-temperature freezing (usually at −196 °C, −320.8 °F, or 77.1 K) of a human corpse, with the hope that resuscitation may be possible in the future.[239][240] It is regarded with skepticism within the mainstream scientific community and has been characterized as quackery.[241]
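
For readers checking the figures, those three temperatures are the same physical point, the boiling point of liquid nitrogen, expressed on three scales. A minimal Python sketch of the conversions, for illustration only:

def celsius_to_kelvin(c: float) -> float:
    return c + 273.15  # the Kelvin scale offsets Celsius by 273.15

def celsius_to_fahrenheit(c: float) -> float:
    return c * 9 / 5 + 32

storage_c = -196.0  # boiling point of liquid nitrogen in Celsius
print(celsius_to_kelvin(storage_c))      # 77.15, the ~77.1 K cited above
print(celsius_to_fahrenheit(storage_c))  # -320.8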

Another proposed life extension technology aims to combine existing and predicted future biochemical and genetic techniques. SENS (Strategies for Engineered Negligible Senescence) proposes that rejuvenation may be obtained by removing aging damage via the use of stem cells and tissue engineering, telomere-lengthening machinery, allotopic expression of mitochondrial proteins, targeted ablation of cells, immunotherapeutic clearance, and novel lysosomal hydrolases.[242]

While some biogerontologists find these ideas "worthy of discussion",[243][244] others contend that the alleged benefits are too speculative given the current state of technology, referring to it as "fantasy rather than science".[4][6]

Genome editing, in which nucleic acid polymers are delivered as a drug and either are expressed as proteins, interfere with the expression of proteins, or correct genetic mutations, has been proposed as a future strategy to prevent aging.[245][246]

A large array of genetic modifications have been found to increase lifespan in model organisms such as yeast, nematode worms, fruit flies, and mice. As of 2013, the longest extension of life caused by a single gene manipulation was roughly 50% in mice and 10-fold in nematode worms.[247]

In July 2020, scientists using public biological data on 1.75 million people with known lifespans identified 10 genomic loci that appear to intrinsically influence healthspan, lifespan, and longevity. Half of these loci had not previously been reported at genome-wide significance, and most are associated with cardiovascular disease; the study identified haem metabolism as a promising candidate for further research within the field. It suggests that high levels of iron in the blood likely reduce, and genes involved in metabolising iron likely increase, healthy years of life in humans.[248][249] The same month, other scientists reported that yeast cells of the same genetic material and within the same environment age in two distinct ways, described a biomolecular mechanism that can determine which process dominates during aging, and genetically engineered a novel aging route with a substantially extended lifespan.[250][251]
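
For context, "genome-wide significance" conventionally means an association p-value below 5 × 10⁻⁸, a threshold chosen to correct for the roughly one million independent tests in a genome-wide scan. The sketch below shows what such a filter amounts to; the variant IDs and p-values are invented placeholders, not data from the cited study.

# Hypothetical illustration of a genome-wide significance filter.
# Variant IDs and p-values are invented placeholders.
GENOME_WIDE_THRESHOLD = 5e-8  # conventional GWAS significance cutoff

p_values = {
    "rs1111111": 3.2e-9,   # below threshold: genome-wide significant
    "rs2222222": 4.1e-6,   # above threshold: merely "suggestive"
    "rs3333333": 1.0e-10,  # below threshold: genome-wide significant
}

significant = {rsid: p for rsid, p in p_values.items()
               if p < GENOME_WIDE_THRESHOLD}
print(sorted(significant))  # ['rs1111111', 'rs3333333']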

In The Selfish Gene, Richard Dawkins describes an approach to life-extension that involves "fooling genes" into thinking the body is young.[252] Dawkins attributes inspiration for this idea to Peter Medawar. The basic idea is that our bodies are composed of genes that activate throughout our lifetimes, some when we are young and others when we are older. Presumably, these genes are activated by environmental factors, and the changes caused by these genes activating can be lethal. It is a statistical certainty that we possess more lethal genes that activate in later life than in early life. Therefore, to extend life, we should be able to prevent these genes from switching on, and we should be able to do so by "identifying changes in the internal chemical environment of a body that take place during aging... and by simulating the superficial chemical properties of a young body".[253]

One hypothetical future strategy that, as some suggest,[who?] "eliminates" the complications related to a physical body involves the copying or transferring (e.g., by progressively replacing neurons with transistors) of a conscious mind from a biological brain to a non-biological computer system or computational device. The basic idea is to scan the structure of a particular brain in detail, and then construct a software model of it that is so faithful to the original that, when run on appropriate hardware, it will behave in essentially the same way as the original brain.[254] Whether or not an exact copy of one's mind constitutes actual life extension is a matter of debate.
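
In software terms, the proposal amounts to recovering a wiring diagram (a connectome) from the scan and then stepping its dynamics forward in time. The toy sketch below caricatures that idea with a random weight matrix and a simple rate-model update; it assumes nothing about real scanning technology and is in no sense a working emulation.

import numpy as np

# Toy caricature of "running a scanned brain in software": the weight
# matrix stands in for a scanned connectome. Purely illustrative;
# whole-brain emulation remains hypothetical.
rng = np.random.default_rng(seed=0)
n_neurons = 100
weights = rng.normal(0.0, 0.1, size=(n_neurons, n_neurons))
state = rng.random(n_neurons)  # initial activity of each model neuron

for _ in range(50):                   # discrete time steps
    state = np.tanh(weights @ state)  # each unit integrates its inputs

print(state[:5])  # activity of the first five model neurons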

However, critics argue that the uploaded mind would simply be a clone and not a true continuation of a person's consciousness.[255]

Some scientists believe that the dead may one day be "resurrected" through simulation technology.[256]

Some clinics currently offer injection of blood products from young donors. The alleged benefits of the treatment, none of which have been demonstrated in a proper study, include a longer life, darker hair, better memory, better sleep, and cures for heart disease, diabetes and Alzheimer's disease.[257][258][259][260][261] The approach is based on parabiosis studies such as those Irina Conboy has done on mice, but Conboy says young blood does not reverse aging (even in mice) and that those who offer those treatments have misunderstood her research.[258][259] Neuroscientist Tony Wyss-Coray, who also studied blood exchanges in mice as recently as 2014, said people offering those treatments are "basically abusing people's trust"[262][259] and that young blood treatments are "the scientific equivalent of fake news".[263] The treatment was featured in HBO's fiction series Silicon Valley.[262]

Two clinics in California, run by Jesse Karmazin and David C. Wright,[257] offer $8,000 injections of plasma extracted from the blood of young people. Karmazin has not published in any peer-reviewed journal and his current study does not use a control group.[263][262][257][259]

Fecal microbiota transplantation[264][265] and probiotics are being investigated as means for life and healthspan extension.[266][267][268]

Read more here:

Life extension - Wikipedia

N. Katherine Hayles – Wikipedia

American literary critic

Nancy Katherine Hayles (born December 16, 1943) is an American postmodern literary critic, most notable for her contributions to the fields of literature and science, electronic literature, and American literature.[1] She is the James B. Duke Distinguished Professor Emerita of Literature in the Trinity College of Arts & Sciences at Duke University.[2]

Hayles was born in Saint Louis, Missouri, to Edward and Thelma Bruns. She received her B.S. in chemistry from the Rochester Institute of Technology in 1966, and her M.S. in chemistry from the California Institute of Technology in 1969. She worked as a research chemist at Xerox Corporation in 1966 and as a chemical research consultant for Beckman Instrument Company from 1968 to 1970. Hayles then switched fields and received her M.A. in English literature from Michigan State University in 1970, and her Ph.D. in English literature from the University of Rochester in 1977.[3] She is a social and literary critic.

Her scholarship primarily focuses on the "relations between science, literature, and technology."[4][5] Hayles has taught at UCLA, the University of Iowa, the University of Missouri–Rolla, the California Institute of Technology, and Dartmouth College.[3] She was the faculty director of the Electronic Literature Organization from 2001 to 2006.[6]

From 2008 to 2018, she was a professor of English and Literature at Duke University. As of 2018, Hayles was the James B. Duke Distinguished Professor Emerita of Literature in the Trinity College of Arts & Sciences at Duke University.[7]

Hayles understands "human" and "posthuman" as constructions that emerge from historically specific understandings of technology, culture and embodiment; the "human" and "posthuman" views each produce unique models of subjectivity.[8] Within this framework, "human" is aligned with Enlightenment notions of liberal humanism, including its emphasis on the "natural self" and the freedom of the individual.[9] Conversely, the posthuman does away with the notion of a "natural" self and emerges when human intelligence is conceptualized as being co-produced with intelligent machines. According to Hayles, the posthuman view privileges information over materiality, considers consciousness an epiphenomenon, and imagines the body as a prosthesis for the mind.[10] Specifically, Hayles suggests that in the posthuman view "there are no essential differences or absolute demarcations between bodily existence and computer simulation..."[9] The posthuman thus emerges as a deconstruction of the liberal humanist notion of "human." Hayles rejects the idea of a form of immortality created through the preservation of human knowledge with computers, instead opting for a specification within the definition of the posthuman whereby one embraces the possibilities of information technology without the imagined concepts of infinite power and immortality, tropes often associated with technology and dissociated from traditional humanity. This idea of the posthuman also ties in with cybernetics: the feedback loop allows humans to interact with technology through a black box, linking the human and the machine as one. Hayles connects this to an overall cultural perception of virtuality and a priority on information rather than materiality.

Despite drawing out the differences between "human" and "posthuman", Hayles is careful to note that both perspectives engage in the erasure of embodiment from subjectivity.[11] In the liberal humanist view, cognition takes precedence over the body, which is narrated as an object to possess and master. Meanwhile, popular conceptions of the cybernetic posthuman imagine the body as merely a container for information and code. Noting the alignment between these two perspectives, Hayles uses How We Became Posthuman to investigate the social and cultural processes and practices that led to the conceptualization of information as separate from the material that instantiates it.[12] Drawing on diverse examples, such as Turing's imitation game, Gibson's Neuromancer and cybernetic theory, Hayles traces the history of what she calls "the cultural perception that information and materiality are conceptually distinct and that information is in some sense more essential, more important and more fundamental than materiality."[13] By tracing the emergence of such thinking, and by looking at the manner in which literary and scientific texts came to imagine, for example, the possibility of downloading human consciousness into a computer, Hayles attempts to trouble the information/material separation and, in her words, to "...put back into the picture the flesh that continues to be erased in contemporary discussions about cybernetic subjects."[14] In this regard, the posthuman subject under the condition of virtuality is an "amalgam, a collection of heterogeneous components, a material-informational entity whose boundaries undergo continuous construction and reconstruction."[15] Hayles differentiates "embodiment" from the concept of "the body" because "in contrast to the body, embodiment is contextual, enmeshed within the specifics of place, time, physiology, and culture, which together compose enactment."[16] Hayles specifically examines how various science fiction novels portray a shift in the conception of information, particularly in the dialectics of presence/absence toward pattern/randomness. She diagrams these shifts to show how ideas about abstraction and information actually have a "local habitation" and are "embodied" within the narratives. Although ideas about "information" taken out of context create abstractions about the human "body", reading science fiction situates these same ideas in "embodied" narrative.

According to Hayles, most human cognition happens outside of consciousness/unconsciousness; cognition extends through the entire biological spectrum, including animals and plants; and technical devices cognize, and in doing so profoundly influence human complex systems.[17][18] Hayles makes a distinction between thinking and cognition. In Unthought: The Power of the Cognitive Nonconscious, she describes thinking:

"Thinking, as I use the term, refers to high-level mental operations such as reasoning abstractly, creating and using verbal languages, constructing mathematical theorems, composing music, and the like, operations associated with higher consciousness."[19]

She describes cognition:

"Cognition is a much broader capacity that extends far beyond consciousness into other neurological brain processes; it is also pervasive in other life forms and complex technical systems. Although the cognitive capacity that exists beyond consciousness goes by various names, I call it nonconscious cognition."[20]

Within the field of Posthuman Studies, Hayles' How We Became Posthuman is considered "the key text which brought posthumanism to broad international attention".[21] In the years since this book was published, it has been both praised and critiqued by scholars who have viewed her work through a variety of lenses; including those of cybernetic history, feminism, postmodernism, cultural and literary criticism, and conversations in the popular press about humans' changing relationships to technology.

Reactions to Hayles' writing style, general organization, and scope of the book have been mixed. The book is generally praised for displaying depth and scope in its combining of scientific ideas and literary criticism. Linda Brigham of Kansas State University claims that Hayles manages to lead the text "across diverse, historically contentious terrain by means of a carefully crafted and deliberate organizational structure."[22] Some scholars found her prose difficult to read or over-complicated. Andrew Pickering describes the book as "hard going" and lacking "straightforward presentation."[23] Dennis Weiss of York College of Pennsylvania accuses Hayles of "unnecessarily complicat[ing] her framework for thinking about the body", for example by using terms such as "body" and "embodiment" ambiguously. Weiss nevertheless acknowledges as convincing her use of science fiction to reveal how "the narrowly focused, abstract constellation of ideas" of cybernetics circulates through a broader cultural context.[24] Craig Keating of Langara College argues, on the contrary, that the obscurity of some texts calls into question their ability to function as conduits for scientific ideas.[25]

Several scholars reviewing How We Became Posthuman highlighted the strengths and shortcomings of her book vis-à-vis its relationship to feminism. Amelia Jones of the University of Southern California describes Hayles' work as reacting to the misogynistic discourse of the field of cybernetics.[26] As Pickering wrote, Hayles' promotion of an "embodied posthumanism" challenges cybernetics' "equation of human-ness with disembodied information" for being "another male trick to feminists tired of the devaluation of women's bodily labor."[23] Stephanie Turner of Purdue University also described Hayles' work as an opportunity to challenge prevailing concepts of the human subject which assumed the body was white, male, and European, but suggested Hayles' dialectic method may have taken too many interpretive risks, leaving some questions open about "which interventions promise the best directions to take."[27]

Reviewers were mixed about Hayles' construction of the posthuman subject. Weiss describes Hayles' work as challenging the simplistic dichotomy of human and post-human subjects in order to "rethink the relationship between human beings and intelligent machines," but suggests that in her attempt to set her vision of the posthuman apart from the "realist, objectivist epistemology characteristic of first-wave cybernetics", she too falls back on universalist discourse, premised this time on how cognitive science is able to reveal the "true nature of the self."[24] Jones similarly described Hayles' work as reacting to cybernetics' disembodiment of the human subject by swinging too far towards an insistence on a "physical reality" of the body apart from discourse. Jones argued that reality is rather "determined in and through the way we view, articulate, and understand the world".[26]

In terms of the strength of Hayles' arguments regarding the return of materiality to information, several scholars expressed doubt about the validity of the provided grounds, notably evolutionary psychology. Keating claims that while Hayles follows evolutionary psychological arguments in order to argue for the overcoming of the disembodiment of knowledge, she provides "no good reason to support this proposition."[25] Brigham describes Hayles' attempt to connect autopoietic circularity to "an inadequacy in Maturana's attempt to account for evolutionary change" as unjustified.[22] Weiss suggests that she makes the mistake of "adhering too closely to the realist, objectivist discourse of the sciences," the same mistake she criticizes Wiener and Maturana for committing.[24]

Go here to read the rest:

N. Katherine Hayles - Wikipedia

Book giveaway for Posthuman by M.C. Hansen Nov 14-Nov 30, 2022

A disturbing, delicious page-turner that pulls you in and keeps you coming back for more.

SUSPENSE and HORROR at its best!

--

Kaufman Striker spent his whole life learning to be unfeeling; it took hanging himself to change that. Ten years ago, he thought he'd gotten away from being the town's peculiar celebrity; thought he'd gotten away from his father's warped ideas about self-mastery, but his dogmatic dear old dad has reached out from the past to continue his education with a letter encouraging Kaufman to take his own life.

For today in Decoy, Nevada, death isn't permanent.

In an underground military facility, a top-secret resurrection project has been sabotaged. Except scientific resurrection doesn't account for everything: not the bipedal coyotes that stalk the streets, nor the thousands of missing townspeople, nor Kaufman's own subtle enhancements.

Part psychological thriller, part dystopian sci-fi, Posthuman is a suspense-horror novel that probes what would happen if science discovered proof of life after death and then nudged evolution to take us there. With deep themes and a rich, intricate plot, Posthuman has enough twists, turns, and surprises that once you reach the last page, you'll want to start reading it all over again.

Read more:

Book giveaway for Posthuman by M.C. Hansen Nov 14-Nov 30, 2022

Posthuman – Wikipedia

Person or entity that exists in a state beyond being human

Posthuman or post-human is a concept originating in the fields of science fiction, futurology, contemporary art, and philosophy that means a person or entity that exists in a state beyond being human.[1] The concept aims at addressing a variety of questions, including ethics and justice, language and trans-species communication, social systems, and the intellectual aspirations of interdisciplinarity.

Posthumanism is not to be confused with transhumanism (the biotechnological enhancement of human beings) and narrow definitions of the posthuman as the hoped-for transcendence of materiality.[2] The notion of the posthuman comes up both in posthumanism as well as transhumanism, but it has a special meaning in each tradition. In 2017, Penn State University Press in cooperation with Stefan Lorenz Sorgner and James Hughes established the Journal of Posthuman Studies,[3] in which all aspects of the concept "posthuman" can be analysed.[4]

In critical theory, the posthuman is a speculative being that represents or seeks to re-conceive the human. It is the object of posthumanist criticism, which critically questions humanism, a branch of humanist philosophy which claims that human nature is a universal state from which the human being emerges; human nature is autonomous, rational, capable of free will, and unified in itself as the apex of existence. Thus, the posthuman position recognizes imperfectability and disunity within oneself, and understands the world through heterogeneous perspectives while seeking to maintain intellectual rigor and dedication to objective observations. Key to this posthuman practice is the ability to fluidly change perspectives and manifest oneself through different identities. The posthuman, for critical theorists of the subject, has an emergent ontology rather than a stable one; in other words, the posthuman is not a singular, defined individual, but rather one who can "become" or embody different identities and understand the world from multiple, heterogeneous perspectives.[5]

Approaches to posthumanism are not homogeneous, and have often been very critical. The term itself is contested, with one of the foremost authors associated with posthumanism, Manuel de Landa, decrying the term as "very silly."[6] Covering the ideas of, for example, Robert Pepperell's The Posthuman Condition, and Hayles's How We Became Posthuman under a single term is distinctly problematic due to these contradictions.

The posthuman is roughly synonymous with the "cyborg" of A Cyborg Manifesto by Donna Haraway.[citation needed][7] Haraway's conception of the cyborg is an ironic take on traditional conceptions of the cyborg that inverts the traditional trope of the cyborg whose presence questions the salient line between humans and robots. Haraway's cyborg is in many ways the "beta" version of the posthuman, as her cyborg theory prompted the issue to be taken up in critical theory.[8] Following Haraway, Hayles, whose work grounds much of the critical posthuman discourse, asserts that liberal humanism, which separates the mind from the body and thus portrays the body as a "shell" or vehicle for the mind, becomes increasingly complicated in the late 20th and 21st centuries because information technology puts the human body in question. Hayles maintains that we must be conscious of information technology advancements while understanding information as "disembodied," that is, something which cannot fundamentally replace the human body but can only be incorporated into it and human life practices.[9]

The idea of post-posthumanism (post-cyborgism) has recently been introduced.[10][11][12][13][14] This body of work outlines the after-effects of long-term adaptation to cyborg technologies and their subsequent removal, e.g., what happens after 20 years of constantly wearing computer-mediating eyeglass technologies and subsequently removing them, or after long-term adaptation to virtual worlds followed by a return to "reality",[15][16] as well as the associated post-cyborg ethics (e.g., the ethics of forced removal of cyborg technologies by authorities).[17]

Posthuman political and natural rights have been framed on a spectrum with animal rights and human rights.[18] Posthumanism broadens the scope of what it means to be a valued life form and to be treated as such (in contrast to certain life forms being seen as less-than and being taken advantage of or killed off); it calls for a more inclusive definition of life, and a greater moral-ethical response, and responsibility, to non-human life forms in the age of species blurring and species mixing: "[I]t interrogates the hierarchic ordering and subsequently exploitation and even eradication of life forms."[19]

According to transhumanist thinkers, a posthuman is a hypothetical future being "whose basic capacities so radically exceed those of present humans as to be no longer unambiguously human by our current standards."[20] Discussion of posthumans in this sense focuses primarily on cybernetics, the resulting posthuman condition, and the relationship to digital technology; the emphasis is on systems. Steve Nichols published the Posthuman Movement manifesto in 1988; his early evolutionary theory of mind (MVT) allows for the development of sentient E1 brains. Transhumanism does not focus on either of these. Instead, transhumanism focuses on the modification of the human species via any kind of emerging science, including genetic engineering, digital technology, and bioengineering.[21] Transhumanism is sometimes criticized for not adequately addressing the scope of posthumanism and its concerns for the evolution of humanism.[22]

Posthumans could be completely synthetic artificial intelligences, or a symbiosis of human and artificial intelligence, or uploaded consciousnesses, or the result of making many smaller but cumulatively profound technological augmentations to a biological human, i.e. a cyborg. Some examples of the latter are redesigning the human organism using advanced nanotechnology or radical enhancement using some combination of technologies such as genetic engineering, psychopharmacology, life extension therapies, neural interfaces, advanced information management tools, memory enhancing drugs, wearable or implanted computers, and cognitive techniques.[20]

As used in this article, "posthuman" does not necessarily refer to a conjectured future where humans are extinct or otherwise absent from the Earth.[23] Kevin Warwick says that both humans and posthumans will continue to exist but the latter will predominate in society over the former because of their abilities.[24] Recently, scholars have begun to speculate that posthumanism provides an alternative analysis of apocalyptic cinema and fiction,[25] often casting vampires, werewolves and even zombies as potential evolutions of the human form and being.[26]

Many science fiction authors, such as Greg Egan, H. G. Wells, Isaac Asimov, Bruce Sterling, Frederik Pohl, Greg Bear, Charles Stross, Neal Asher, Ken MacLeod, Peter F. Hamilton and authors of the Orion's Arm Universe,[27] have written works set in posthuman futures.

A variation on the posthuman theme is the notion of a "posthuman god"; the idea that posthumans, being no longer confined to the parameters of human nature, might grow physically and mentally so powerful as to appear possibly god-like by present-day human standards.[20] This notion should not be interpreted as being related to the idea portrayed in some science fiction that a sufficiently advanced species may "ascend" to a higher plane of existence; rather, it merely means that some posthuman beings may become so exceedingly intelligent and technologically sophisticated that their behaviour would not be comprehensible to modern humans, purely by reason of their limited intelligence and imagination.[28]

Read the original here:

Posthuman - Wikipedia

Conceptions of God – Wikipedia

Conceptions of God in monotheist, pantheist, and panentheist religions, or of the supreme deity in henotheistic religions, can extend to various levels of abstraction.

The first recordings that survive of monotheistic conceptions of God, borne out of henotheism and (mostly in Eastern religions) monism, are from the Hellenistic period. Of the many objects and entities that religions and other belief systems across the ages have labeled as divine, the one criterion they share is their acknowledgment as divine by a group or groups of human beings.

In his Metaphysics, Aristotle discusses the meaning of "being as being". Aristotle holds that "being" primarily refers to the Unmoved Movers, and assigned one of these to each movement in the heavens. Each Unmoved Mover continuously contemplates its own contemplation; everything that fits the second meaning of "being" has its source of motion in itself and moves because knowledge of its Mover causes it to emulate this Mover (or should).

Aristotle's definition of God attributes perfection to this being, and, as a perfect being, it can only contemplate upon perfection and not on imperfection; otherwise perfection would not be one of his attributes. God, according to Aristotle, is in a state of "stasis" untouched by change and imperfection. The "unmoved mover" is very unlike the conception of God that one sees in most religions. It has been likened to a person playing dominos who pushes one of them over, so that every other domino in the set is pushed over as well, without the being having to do anything about it. Although the French educator Allan Kardec brought a very similar conception of God in the 19th century during his work of codifying Spiritism, this differs from the interpretation of God in most religions, where he is seen to be personally involved in his creation.

In the ancient Greek philosophical Hermetica, the ultimate reality is called by many names, such as God, Lord, Father, Mind (Nous), the Creator, the All, the One, etc.[1] However, peculiar to the Hermetic view of the divinity is that it is both the all (Greek: to pan) and the creator of the all: all created things pre-exist in God,[2] and God is the nature of the cosmos (being both the substance from which it proceeds and the governing principle which orders it),[3] yet the things themselves and the cosmos were all created by God. Thus, God creates itself,[4] and is both transcendent (as the creator of the cosmos) and immanent (as the created cosmos).[5] These ideas are closely related to the cosmo-theological views of the Stoics.[6]

The Abrahamic God in this sense is the conception of God that remains a common attribute of all three traditions. God is conceived of as eternal, omnipotent, omniscient and as the creator of the universe. God is further held to have the properties of holiness, justice, omnibenevolence and omnipresence. Proponents of Abrahamic faiths believe that God is also transcendent, meaning that he is outside space and outside time and therefore not subject to anything within his creation, but at the same time a personal God, involved, listening to prayer and reacting to the actions of his creatures.

The Bahá'í Faith believes in a single, imperishable God, the creator of all things, including all the creatures and forces in the universe.[7] In Bahá'í belief, God is beyond space and time but is also described as "a personal God, unknowable, inaccessible, the source of all Revelation, eternal, omniscient, omnipresent and almighty."[8] Though inaccessible directly, God is nevertheless seen as conscious of creation, possessing a mind, will and purpose. Bahá'ís believe that God expresses this will at all times and in many ways, including through Manifestations, a series of divine "messengers" or "educators".[9] In expressing God's intent, these manifestations are seen to establish religion in the world. Bahá'í teachings state that God is too great for humans either to fully comprehend or to represent in a complete and accurate image.[10] Bahá'u'lláh often refers to God by titles, such as the "All-Powerful" or the "All-Loving".

In many Gnostic systems, God is known as the Monad, or the One.

Within Christianity, the doctrine of the Trinity states that God is a single being that exists, simultaneously and eternally, as a perichoresis of three hypostases (i.e. persons; personae, prosopa): the Father (the Source, the Eternal Majesty); the Son (the eternal Logos ("Word"), manifest in human form as Jesus and thereafter as Christ); and the Holy Spirit (the Paraclete or advocate). Since the 4th Century AD, in both Eastern and Western Christianity, this doctrine has been stated as "One God in Three Persons", all three of whom, as distinct and co-eternal "persons" or "hypostases", share a single divine essence, being, or nature.

Following the First Council of Constantinople, the Son is described as eternally begotten by the Father ("begotten of his Father before all worlds"[11]). This generation does not imply a beginning for the Son or an inferior relationship with the Father. The Son is the perfect image of his Father, and is consubstantial with him. The Son returns that love, and that union between the two is the third person of the Trinity, the Holy Spirit. The Holy Spirit is consubstantial and co-equal with the Father and the Son. Thus, God contemplates and loves himself, enjoying infinite and perfect beatitude within himself. This relationship between the other two persons is called procession. Although the theology of the Trinity is accepted in most Christian churches, there are theological differences, notably between Catholic and Orthodox thought on the procession of the Holy Spirit (see filioque). Some Christian communions do not accept the Trinitarian doctrine, at least not in its traditional form. Notable groups include the Jehovah's Witnesses, Mormons, Christadelphians, Unitarians, Arians, and Adoptionists.

Within Christianity, Unitarianism is the view that God consists of only one person, the Father, instead of three persons as Trinitarianism states.[12] Unitarians believe that mainstream Christianity has been corrupted over history, and that it is not strictly monotheistic. There are different Unitarian views on Jesus, ranging from seeing him purely as a man who was chosen by God, to seeing him as a divine being, as the Son of God who had pre-existence.[13] Thus, Unitarianism is typically divided into two principal groups:

Even though the term "unitarian" did not appear until the 17th century, in reference to the Polish Brethren,[17][15] the basic tenets of Unitarianism go back to the time of Arius, a 4th-century Alexandrian priest who taught the doctrine that only the Father was God, and that the Son had been created by the Father. Arians rejected the term "homoousios" (consubstantial) as a term describing the Father and Son, viewing such a term as compromising the uniqueness and primacy of God,[18] and accusing it of dividing the indivisible unity of the divine essence.[19] Unitarians trace their history back to the Apostolic Age, arguing, as do Trinitarians and Binitarians, that their Christology most closely reflects that of the early Christian community and Church Fathers.[20]

Binitarianism is the view within Christianity that there were originally two beings in the Godhead: the Father and the Word that became the Son (Jesus the Christ).[citation needed] Binitarians normally believe that God is a family, currently consisting of the Father and the Son.[citation needed] Some binitarians[who?] believe that others will ultimately be born into that divine family. Hence, binitarians are nontrinitarian, but they are also not unitarian. Binitarians, like most unitarians and trinitarians, claim their views were held by the original New Testament Church. Unlike most unitarians and trinitarians, who tend to identify themselves by those terms, binitarians normally do not refer to their belief in the duality of the Godhead, with the Son subordinate to the Father; they simply teach the Godhead in a manner that has been termed binitarianism.

The word "binitarian" is typically used by scholars and theologians as a contrast to a trinitarian theology: a theology of "two" in God rather than a theology of "three", and although some critics[who?] prefer to use the term ditheist or dualist instead of binitarian, those terms suggests that God is not one, yet binitarians believe that God is one family. It is accurate to offer the judgment that most commonly when someone speaks of a Christian "binitarian" theology the "two" in God are the Father and the Son... A substantial amount of recent scholarship has been devoted to exploring the implications of the fact that Jesus was worshipped by those first Jewish Christians, since in Judaism "worship" was limited to the worship of God" (Barnes M. Early Christian Binitarianism: the Father and the Holy Spirit. Early Christian Binitarianism - as read at NAPS 2001). Much of this recent scholarship has been the result of the translations of the Nag Hammadi and other ancient manuscripts that were not available when older scholarly texts (such as Wilhelm Bousset's Kyrios Christos, 1913) were written.

In the Mormonism represented by most Mormon communities, including the Church of Jesus Christ of Latter-day Saints, "God" means Elohim (the Father), whereas "Godhead" means a council of three distinct entities: Elohim, Jehovah (the Son, or Jesus), and the Holy Spirit. The Father and Son have perfected, material bodies, while the Holy Spirit is a spirit and does not have a body. This conception differs from the traditional Christian Trinity; in Mormonism, the three persons are considered to be physically separate beings, or personages, but indistinguishable in will and purpose.[21] As such, the term "Godhead" differs from how it is used in traditional Christianity. This description of God represents the orthodoxy of the Church of Jesus Christ of Latter-day Saints (LDS Church), established early in the 19th century. However, the Mormon concept of God has expanded since the faith's founding in the late 1820s.[citation needed]

Allāh, without plural or gender, is the divine name of God mentioned in the Quran, while "ilāh" is the term used for a deity or a god in general.[22][23][24]

Islam's most fundamental concept is a strict monotheism called tawhid. God is described in the surah Al-Ikhlas as: "Say: He is God, the One; God, the Eternal, the Absolute; He begot no one, nor is He begotten; Nor is there to Him equivalent anyone."[25][26] Muslims deny the Christian doctrine of the Trinity and the divinity of Jesus, comparing it to polytheism. In Islam, God is beyond all comprehension or equal and does not resemble any of his creations in any way. Thus, Muslims are not iconodules and are not expected to visualize God. The message of God is carried by angels to 124,000 messengers, starting with Adam and concluding with Muhammad. God is described and referred to in the Quran by certain names or attributes, the most common being Al-Rahman, meaning "Most Compassionate", and Al-Rahim, meaning "Most Merciful" (see Names of God in Islam).[27]

Muslims believe that the creation of everything in the universe is brought into being by God's sheer command, "Be", and so it is,[28][29] and that the purpose of existence is to please God, both by worship and by good deeds.[30][31] There are no intermediaries, such as clergy, to contact God: "He is nearer to his creation than the jugular vein."[32]

In Judaism, God has been conceived in a variety of ways.[33] Traditionally, Judaism holds that Yahweh, the God of Abraham, Isaac, and Jacob and the national god of the Israelites, delivered the Israelites from slavery in Egypt, and gave them the Law of Moses at biblical Mount Sinai as described in the Torah. According to the rationalist stream of Judaism articulated by Maimonides, which later came to dominate much of official traditional Jewish thought, God is understood as the absolute one, indivisible, and incomparable being who is the ultimate cause of all existence. Traditional interpretations of Judaism generally emphasize that God is personal yet also transcendent, while some modern interpretations of Judaism emphasize that God is a force or ideal.[34]

Jewish monotheism is a continuation of earlier Hebrew henotheism, the exclusive worship of the God of Israel as prescribed in the Torah and practiced at the Temple of Jerusalem. Strict monotheism emerges in Hellenistic Judaism and Rabbinical Judaism. Pronunciation of the proper name of the God of Israel came to be avoided in the Hellenistic era (Second Temple Judaism) and instead Jews refer to God as HaShem, meaning "the Name". In prayer and reading of scripture, the Tetragrammaton is substituted with Adonai ("my Lord").

Some[who?] Kabbalistic thinkers have held the belief that all of existence is itself a part of God, and that we as humanity are unaware of our own inherent godliness and are grappling to come to terms with it.[citation needed] The standing view in Hasidism currently, is that there is nothing in existence outside of God all being is within God, and yet all of existence cannot contain him.[citation needed] Regarding this, Solomon stated while dedicating the Temple, "But will God in truth dwell with mankind on the earth? Behold, the heaven and the heaven of heavens cannot contain You."[35]

Modern Jewish thinkers have constructed a wide variety of other ideas about God. Hermann Cohen believed that God should be identified with the "archetype of morality," an idea reminiscent of Plato's idea of the Good.[36] Mordecai Kaplan believed that God is the sum of all natural processes that allow man to become self-fulfilled,[37] and Humanistic Judaism fully rejects the notion of the existence of a God.[38]

In Mandaeism, Hayyi Rabbi (Classical Mandaic, romanized: Hiia Rbia, lit. "The Great Life"), or "The Great Living God",[39] is the Supreme God from which all things emanate. He is also known as "The First Life", since during the creation of the material world, Yushamin emanated from Hayyi Rabbi as the "Second Life".[40] According to Qais Al-Saadi, "the principles of the Mandaean doctrine: the belief of the only one great God, Hayyi Rabbi, to whom all absolute properties belong; He created all the worlds, formed the soul through his power, and placed it by means of angels into the human body. So He created Adam and Eve, the first man and woman."[41] Mandaeans recognize God to be the eternal creator of all, the one and only in domination who has no partner.[42]

The non-adherence[43] to the notion of a supreme God or a prime mover is seen as a key distinction between Buddhism and other religious views. In Buddhism, the sole aim of the spiritual practice is the complete alleviation of distress (dukkha) in samsara,[44][45] called nirvana. The Buddha neither denies nor accepts a creator,[46] denies endorsing any views on creation[47] and states that questions on the origin of the world are worthless.[48][49] Some teachers instruct students beginning Buddhist meditation that the notion of divinity is not incompatible with Buddhism,[50] but dogmatic beliefs in a supreme personal creator are considered a hindrance to the attainment of nirvana,[51] the highest goal of Buddhist practice.[52]

Despite this apparent non-theism, Buddhists consider veneration of the Noble Ones[53] very important[54] although the two main schools of Buddhism differ mildly in their reverential attitudes. While Theravada Buddhists view the Buddha as a human being who attained nirvana or arahanthood through human efforts,[55] Mahayana Buddhists consider him an embodiment of the cosmic dharmakaya (a notion of transcendent divinity), who was born for the benefit of others and not merely a human being.[56] In addition, some Mahayana Buddhists worship their chief Bodhisattva, Avalokiteshvara[57] and hope to embody him.[58]

Buddhists accept the existence of beings known as devas in higher realms, but they, like humans, are said to be suffering in samsara,[59] and not necessarily wiser than us. In fact, the Buddha is often portrayed as a teacher of the gods,[60] and superior to them.[61] Despite this, there are believed to be enlightened devas on the path of Buddhahood.

In Buddhism, the idea of the metaphysical absolute is deconstructed in the same way as the idea of an enduring "self", but it is not necessarily denied. Reality is considered dynamic, interactive and non-substantial, which implies rejection of brahman or of a divine substratum. A cosmic principle can be embodied in concepts such as the dharmakaya. Though there is a primordial Buddha (or, in Vajrayana, the Adi-Buddha, a representation of immanent enlightenment in nature), its representation as a creator is a symbol of the presence of a universal cyclical creation and dissolution of the cosmos, not of an actual personal being. An intelligent, metaphysical underlying basis, however, is not ruled out by Buddhism, although Buddhists are generally very careful to distinguish this idea from that of an independent creator God.[62]

In Hinduism, the concept of god is complex and depends on the particular tradition. The concept spans conceptions from absolute monism to henotheism, monotheism and polytheism. In the Vedic period, the monotheistic concept of god culminated in the semi-abstract, semi-personified form of a creative soul dwelling in all gods, such as Vishvakarman, Purusha, and Prajapati. In the majority of Vaishnavism traditions, he is Vishnu, and the texts identify this being as Krishna, sometimes referred to as svayam bhagavan. The term ishvara comes from the root ish, "to have extraordinary power". Some traditional sankhya systems contrast purusha (divine, or souls) with prakriti (nature or energy); however, the term for a sovereign god, ishvara, is mentioned six times in the Atharva Veda and is central to many traditions.[63] According to the Advaita Vedanta school of Hindu philosophy, the notion of Brahman (the highest Universal Principle) is akin to that of god, except that unlike most other philosophies Advaita likens Brahman to atman (the true Self of an individual). For Sindhi Hindus, who are deeply influenced by Sikhism, God is seen as the omnipotent cultivation of all Hindu gods and goddesses.[clarification needed] In short, the soul, or paramatma, of all gods and goddesses is the omnipresent Brahman, and they are enlightened beings.

Brahman is the eternal, unchanging, infinite, immanent, and transcendent reality which is the divine ground of all matter, energy, time, space, being and everything beyond in this Universe.[64][65] The nature of Brahman is described as transpersonal, personal and impersonal by different philosophical schools. The word Brahman is derived from the Sanskrit verb root bṛh ("to grow"), and connotes greatness and infinity.

Brahman is spoken of at two levels (apara and para). He is the fountainhead of all concepts, but he himself cannot be conceived; he is the universal conceiver, the universal concept, and all the means of conception. Apara-Brahman is the same Para-Brahman, but thought of, for human understanding, as a universal mind and universal intellect from which all human beings derive an iota as their own mind, intellect, etc.[citation needed]

Ishvara is a philosophical concept in Hinduism, meaning controller or the Supreme Controller (i.e., God) in a monotheistic sense, or an Ishta-deva of monistic thought. Ishvara is a transcendent and immanent entity best described in the last chapter of the Shukla Yajur Veda Samhita, known as the Ishavasya Upanishad. It states "ishavasyam idam sarvam", which means that whatever there is in this world is covered and filled with Ishvara. Ishvara not only creates the world, but then also enters into everything there is. In Saivite traditions, the term is used as part of the compound "Maheshvara" ("great lord"), later a name for Siva.

Lord Shiva is often considered the foremost Hindu god. Mahadeva literally means "great god". Shiva is also known as Maheshvar, the great Lord; Mahadeva, the great God; Shambhu; Hara; Pinakadhrik, bearer of the axe; and Mrityunjaya, conqueror of death. He is the spouse of Shakti, the goddess. He is also represented by Mahakala and Bhairava, the terrible, as well as many other forms, including Rudra. Shiva is often pictured holding the damaru, an hourglass-shaped drum, together with his trishula. His usual mantra is om namah shivaya.[66]

This must not be confused with the numerous devas. Deva may be roughly translated into English as deity, demigod or angel, and can describe any celestial being or thing that is of high excellence and thus is venerable. The word is cognate to Latin deus for "god". The misconception of 330 million devas is commonly objected to by Hindu scholars. The description of 33 koti (10 million, crore in Hindi) devas is a misunderstanding. The word koti in Sanskrit translates to 'type' and not '10 million'. So the actual translation is 33 types and not 330 million devas. Ishvara as a personal form of God is worshiped and not the 33 devas. The concept of 33 devas is perhaps related to the geometry of the universe.
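
In symbols, the misreading described above is a single substitution: taking koti as the numeral crore turns a count of classes into a multiplication,

33 \text{ koti} \longrightarrow 33 \times 10^{7} = 330{,}000{,}000,

whereas taking koti as "type" involves no multiplication at all and yields simply 33 classes of devas.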

Bhagavan literally means "possessing fortune, blessed, prosperous" (from the noun bhaga, meaning "fortune, wealth", cognate to Slavic bog "god"), and hence "illustrious, divine, venerable, holy", etc. In some traditions of Hinduism it is used to indicate the Supreme Being or Absolute Truth, but with specific reference to that Supreme Being as possessing a personality (a personal God).[citation needed] This personal feature indicated in Bhagavan differentiates its usage from other similar terms such as Brahman, the "Supreme Spirit" or "spirit", and thus, in this usage, Bhagavan is in many ways analogous to the general Christian and Islamic conception of God.

Jainism does not support belief in a creator deity. According to Jain doctrine, the universe and its constituentssoul, matter, space, time, and principles of motionhave always existed. All the constituents and actions are governed by universal natural laws. It is not possible to create matter out of nothing and hence the sum total of matter in the universe remains the same (similar to law of conservation of mass). Jain text claims that the universe consists of Jiva (life force or souls) and Ajiva (lifeless objects). Similarly, the soul of each living being is unique and uncreated and has existed since beginningless time.[67]

The Jain theory of causation holds that a cause and its effect are always identical in nature and hence a conscious and immaterial entity like God cannot create a material entity like the universe. Furthermore, according to the Jain concept of divinity, any soul who destroys its karmas and desires, achieves liberation/Nirvana. A soul who destroys all its passions and desires has no desire to interfere in the working of the universe. Moral rewards and sufferings are not the work of a divine being, but a result of an innate moral order in the cosmos; a self-regulating mechanism whereby the individual reaps the fruits of his own actions through the workings of the karmas.

Through the ages, Jain philosophers have adamantly rejected and opposed the concept of a creator and omnipotent God. This has resulted in Jainism being labeled nastika darsana (atheist philosophy) by rival religious philosophies. The theme of non-creationism and the absence of an omnipotent God and divine grace runs strongly through all the philosophical dimensions of Jainism, including its cosmology, concepts of karma and moksa, and its moral code of conduct. Jainism asserts that a religious and virtuous life is possible without the idea of a creator god.[68]

The term for God in Sikhism is Waheguru. Guru Nanak describes God as nirankar (from the Sanskrit nirākāra, meaning "formless"), akal (meaning "eternal") and alakh (from the Sanskrit alakṣya, meaning "invisible" or "unobserved"). Sikhism's principal scripture, the Guru Granth Sahib, starts with the figure "1", signifying the unity of God. Nanak's interpretation of God is that of a single, personal and transcendental creator with whom the devotee must develop a most intimate faith and relationship to achieve salvation. Sikhism advocates the belief in one god who is omnipresent (sarav viāpak), whose qualities are infinite and who is without gender, a nature represented (especially in the Guru Granth Sahib) by the term Ek Onkar.

Nanak further emphasizes that a full understanding of God is beyond human beings, but that God is also not wholly unknowable. God is considered omnipresent in all creation and visible everywhere to the spiritually awakened. Nanak stresses that God must be seen by human beings from "the inward eye" or "heart" and that meditation must take place inwardly to achieve this enlightenment progressively; its rigorous application is what enables communication between God and human beings.

Sikhs believe in a single god that has existed from the beginning of time and will survive forever. God is genderless, fearless, formless, immutable, ineffable, self-sufficient, omnipotent and not subject to the cycle of birth and death.

God in Sikhism is depicted in three distinct aspects: God as deity; God in relation to creation; and God in relation to man. During a discourse with siddhas (wandering Hindu adepts), Nanak is asked where "the Transcendent God" was before creation. He replies: "To think of the Transcendent Lord in that state is to enter the realm of wonder. Even at that stage of sunn, he permeated all that void" (GG, 940).

The book Western Wisdom Teachings presents the conception of The Absolute (unmanifested and unlimited "Boundless Being" or "Root of Existence", beyond the whole universe and beyond comprehension) from whom proceeds the Supreme Being at the dawn of manifestation: The One, the "Great Architect of the Universe". From the threefold Supreme Being proceed the "seven Great Logoi" who contain within themselves all the great hierarchies that differentiate more and more as they diffuse through the six lower Cosmic Planes. In the Highest World of the seventh (lowest) Cosmic Plane dwells the god of the solar systems in the universe. These great beings are also threefold in manifestation, like the Supreme Being; their three aspects are Will, Wisdom and Activity.

According to the teachings of the Rosicrucian Fellowship, in the beginning of a Day of Manifestation a certain collective Great Being, God, limits himself to a certain portion of space, in which he elects to create the Solar System for the evolution of added self-consciousness. In God there are contained hosts of glorious hierarchies and lesser beings of every grade of intelligence and stage of consciousness, from omniscience to an unconsciousness deeper than that of the deepest trance condition.

During the current period of manifestation, these various grades of beings are working to acquire more experience than they possessed at the beginning of this period of existence. Those who, in previous manifestations, have attained to the highest degree of development work on those who have not yet evolved any consciousness. In the Solar system, God's Habitation, there are seven Worlds differentiated by God, within Himself, one after another. Mankind's evolutionary scheme is slowly carried through five of these Worlds in seven great Periods of manifestation, during which the evolving virgin spirit becomes first human and, then, a God.

Concepts about deity are diverse among UUs. Some have no belief in any gods (atheism); others believe in many gods (polytheism). Some believe the question of the existence of any god is most likely unascertainable or unknowable (agnosticism). Some believe God is a metaphor for a transcendent reality. Some believe in a female god (goddess), a passive god (Deism), an Abrahamic god, or a god manifested in nature or the universe (pantheism). Many UUs reject the idea of deities and instead speak of the "spirit of life" that binds all life on Earth. UUs support each person's search for truth and meaning in concepts of spirituality. Historically, unitarianism and universalism were denominations within Christianity. Unitarianism referred to a belief about the nature of Jesus Christ that affirmed God as a singular entity and rejected the doctrine of the Trinity. Universalism referred to a theological belief that all persons will be reconciled to God because of divine love and mercy (Universal Salvation).[69]

According to Brahma Kumaris, God is the incorporeal soul with the maximum degree of spiritual qualities such as peace and love.[70][71]

Some comparatively new belief systems and books portray God as extraterrestrial life. Many of these theories hold that intelligent beings from another world have been visiting Earth for many thousands of years and have influenced the development of our religions. Some of these books posit that prophets or messiahs were sent to the human race in order to teach morality and encourage the development of civilization (see, for example, Rael and Zecharia Sitchin).

The spiritual teacher Meher Baba described God as infinite love: "God is not understood in His essence until He is also understood as Infinite Love. Divine Love is unlimited in essence and expression, because it is experienced by the soul through the soul itself. The sojourn of the soul is a thrilling divine romance in which the lover, who in the beginning is conscious of nothing but emptiness, frustration, superficiality and the gnawing chains of bondage, gradually attains an increasingly fuller and freer expression of love and ultimately disappears and merges in the Divine Beloved to realize the unity of the Lover and the Beloved in the supreme and eternal fact of God as Infinite Love."[72]

Anton LaVey, founder of the Church of Satan, espoused the view that "god" is a creation of man, rather than man being a creation of "god". In his book, The Satanic Bible, the Satanist's view of god is described as the Satanist's true "self"a projection of his or her own personalitynot an external deity.[73] Satan is used as a representation of personal liberty and individualism.[74] LaVey discusses this extensively in The Book of Lucifer, explaining that the gods worshipped by other religions are also projections of man's true self. He argues that man's unwillingness to accept his own ego has caused him to externalize these gods so as to avoid the feeling of narcissism that would accompany self-worship.[75]

"If man insists on externalizing his true self in the form of "God," then why fear his true self, in fearing "God,"why praise his true self in praising "God,"why remain externalized from "God" in order to engage in ritual and religious ceremony in his name?Man needs ritual and dogma, but no law states that an externalized god is necessary in order to engage in ritual and ceremony performed in a god's name! Could it be that when he closes the gap between himself and his "God" he sees the demon of pride creeping forththat very embodiment of Lucifer appearing in his midst?"

Process theology is a school of thought influenced by the metaphysical process philosophy of Alfred North Whitehead (1861–1947), while open theism is a similar theological movement that began in the 1990s.

In both views, God is not omnipotent in the classical sense of a coercive being. Reality is not made up of material substances that endure through time but of serially ordered events, which are experiential in nature. The universe is characterized by process and change carried out by agents of free will. Self-determination characterizes everything in the universe, not just human beings. God and creatures co-create. God cannot force anything to happen but can only influence the exercise of this universal free will by offering possibilities. Process theology is compatible with panentheism, the concept that God contains the universe (pantheism) but also transcends it. A related conception casts God as the ultimate logician: the only entity, by definition, able to reduce an infinite number of logical equations, with an infinite number of variables and an infinite number of states, to minimum form instantaneously.

A posthuman God is a hypothetical future entity descended from or created by humans, but possessing capabilities so radically exceeding those of present humans as to appear godlike. One common variation of this idea is the belief or aspiration that humans will create a God entity emerging from an artificial intelligence. Another variant is that humanity itself will evolve into a posthuman God.

The concept of a posthuman god has become common in science fiction. Science fiction author Arthur C. Clarke said in an interview, "It may be that our role on this planet is not to worship God, but to create him." Clarke's friend and colleague, the late Isaac Asimov, postulated in his story "The Last Question" a merger between humanity and machine intelligence that ultimately produces a deity capable of reversing entropy and subsequently initiates a new Creation trillions of years from the present era when the Universe is in the last stage of heat death. In Frank Herbert's science-fiction series Dune, a messianic figure is created after thousands of years of controlled breeding. The Culture series, by Iain M. Banks, represents a blend in which a transhuman society is guarded by godlike machine intelligences. A stronger example is posited in the novel Singularity Sky by Charles Stross, in which a future artificial intelligence is capable of changing events even in its own past, and takes strong measures to prevent any other entity from taking advantage of similar capabilities. Another example appears in the popular online novella The Metamorphosis of Prime Intellect in which an advanced artificial intelligence uses its own advanced quantum brain to resolve discrepancies in physics theories and develop a unified field theory which gives it absolute control over reality, in a take on philosophical digitalism.

The philosopher Michel Henry defines God from a phenomenological point of view. He says: "God is Life, he is the essence of Life, or, if we prefer, the essence of Life is God. Saying this, we already know what God is: the Father, the Almighty, creator of heaven and earth. We know it not by the effect of learning or of some knowledge; we don't know it by thought, against the background of the truth of the world; we know it, and we can know it, only in and by Life itself. We can know it only in God."[77]

This Life is not biological life defined by objective and exterior properties, nor an abstract and empty philosophical concept, but the absolute phenomenological life: a radically immanent life that possesses in itself the power of showing itself, without distance, a life that permanently reveals itself.


Eliminating the Black-White Wealth Gap Is a Generational Challenge

Introduction and summary

The importance of household wealth has become abundantly clear during the COVID-19 pandemic. Wealth is the difference between what families own (for instance, their savings and checking accounts, retirement savings, houses, and cars) and what they owe on credit cards, student loans, and mortgages, among other debt.
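
Wealth, in other words, is a simple balance-sheet calculation. The minimal sketch below illustrates it; the asset and debt figures are hypothetical examples, not the survey definitions or data used in this report.

```python
# Minimal sketch of the household-wealth (net worth) calculation described above.
# All figures are hypothetical illustrations, not data from the report.

assets = {
    "checking_and_savings": 5_000,
    "retirement_accounts": 30_000,
    "house": 250_000,
    "car": 15_000,
}

debts = {
    "mortgage": 180_000,
    "student_loans": 25_000,
    "credit_cards": 4_000,
}

# Wealth = what a family owns minus what it owes.
wealth = sum(assets.values()) - sum(debts.values())
print(f"Net worth: ${wealth:,}")  # Net worth: $91,000
```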

Yet wealth is vastly unequally distributed across the United States. Black households have a fraction of the wealth of white households, leaving them in a much more precarious financial situation when a crisis strikes and with fewer economic opportunities. Wealth allows households to weather a financial emergency such as a layoff or a family member's illness. The pandemic brought multiple such emergencies to American families across all demographics. However, the lack of financial security combined with disproportionate exposure to the deadly coronavirus has had especially disastrous results for the Black community.

Wealth also provides families the means to invest in their children's education, start a business, relocate for new and better opportunities, buy a house, and participate more fully in the democratic process. Many households in Black communities cannot afford to pay for reliable internet or electronic devices to facilitate remote learning.1 White workers have been more likely to work remotely during the pandemic and have resources to devote to their children's remote learning environment, while Black workers are more likely to still be going to work in person. The pandemic has created the perfect storm of factors that will drive wealth for Black and white households even further apart.

Wealth is not only a question of financial savings; it also provides access to the political process and, therefore, exerts political influence. Households with wealth have a measure of economic security and can donate time and money, thereby influencing the political process and the policies that are important to their communities. Yet Congress has not devoted enough attention to the physical and economic harm the coronavirus crisis has wrought on African American communities.

The persistent Black-white wealth gap is not an accident but rather the result of centuries of federal and state policies that have systematically facilitated the deprivation of Black Americans. From the brutal exploitation of Africans during slavery, to systematic oppression in the Jim Crow South, to today's institutionalized racism (apparent in disparate access to and outcomes in education, health care, jobs, housing, and criminal justice), government policy has created or maintained hurdles for African Americans who attempt to build, maintain, and pass on wealth.

In 2019, the Center for American Progress invited a number of leading national experts on racism and wealth to join the National Advisory Council on Eliminating the Black-White Wealth Gap2 to make eradicating this racial disparity a pressing policy goal for the next presidential administration and to identify steps necessary to accomplish it. This group engaged in a yearlong discussion guided by the following principles:

The importance of addressing the Black-white wealth gap

In 2019, the median wealth (without defined-benefit pensions) of Black households in the United States was $24,100, compared with $189,100 for white households. The typical Black household therefore had 12.7 percent of the wealth of the typical white household, owning $165,000 less. The average gap is somewhat smaller in relative terms but much larger in dollar terms. The average Black household had $142,330 in 2019, compared with $980,549 for the average white household. This means that, on average, Black households had 14.5 percent of the wealth of white households, with an absolute dollar gap of $838,220.
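
The ratios and dollar gaps above follow directly from the reported medians and averages. A quick check, using the figures quoted in the text:

```python
# Reproducing the Black-white wealth-gap figures quoted above
# (2019 medians and averages, without defined-benefit pensions).

median_black, median_white = 24_100, 189_100
avg_black, avg_white = 142_330, 980_549

print(f"Median ratio: {median_black / median_white:.1%}")      # 12.7%
print(f"Median dollar gap: ${median_white - median_black:,}")  # $165,000
print(f"Average ratio: {avg_black / avg_white:.1%}")           # 14.5%
# Computes to $838,219; the text's $838,220 reflects rounding in the source data.
print(f"Average dollar gap: ${avg_white - avg_black:,}")
```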

The massive Black-white wealth disparity is nothing new in this country. It has persisted for centuries and has been apparent in consistent, nationally representative data for at least three decades. The gap between Black and white households appears to have widened again in the latter part of 2020 as the pandemic and deep recession took hold, especially hurting Black Americans. Black households needed to rely on their savings more than white households did to cover both health care emergencies and the economic fallout from layoffs. Just a few months into the pandemic, average wealth for Black households was growing more slowly than that of white households, a reversal of the pre-pandemic trend.

Low wealth among many Black Americans left them especially vulnerable to the myriad risks of the coronavirus crisis. Black workers were more likely to lose their jobs even as they faced greater health care risks. They worked in jobs with greater exposure to the coronavirus and lived in communities with weaker health care infrastructures. As risks and costs soared, they quickly experienced more material hardship. Hunger, the threat of eviction or foreclosure, and an inability to pay bills were more prevalent among Black households than among white ones. More than two-thirds (68.1 percent) of Black families with incomes from $35,000 to $100,000 who had lost work during the pandemic indicated that they could not afford all of the food they needed, faced eviction or foreclosure, or had difficulty paying all of their bills from August 2020 to December 2020.3 These situations applied to 49.3 percent of white households in this income category. All types of families have suffered during the recession, but Black families have struggled more because they have fewer savings to fall back on.

Black households face systematic obstacles in building wealth

The persistent Black-white wealth gap is the result of a discriminatory economic system that keeps Black households from achieving the American dream.4 This system has always made it difficult for Black households to acquire and keep capital, and this lack of capital has created a persistently large racial wealth disparity, as African Americans have had less wealth to pass on to the next generation than white households. There are several other obstacles to building wealth:

The unjust obstacles to building wealth for Black households have existed for centuries, and the iterative nature of wealth begetting more wealth means that without public interventions, it will be virtually impossible for Black Americans to catch up to their white counterparts. White families are better situated to pass on wealth from one generation to the next. White households first benefited from the dehumanizing system of slavery (directly, in this case, as a white slaveholding plantation class) but also from the discriminatory institutions that emerged and persisted after the Civil War. White households have been able to build wealth for themselves and their descendants, while whatever wealth Black families could amass was regularly stripped away. Private businesses and governments institutionalized racism and discrimination. They also encouraged and sanctioned violence targeting Black lives and property. The destruction of Black Wall Street in the Greenwood neighborhood of Tulsa, Oklahoma, in 1921 serves as one of many horrid and systematic examples.6

Following centuries of oppression of Black households, white households are much more likely to receive an inheritance from their parents and grandparents, and their inheritances are much larger than those of Black households.7 Moreover, white households have access to larger and wealthier social networks that they can tap into for job and career opportunities for them and their children. Addressing the persistent Black-white wealth gap means countering the centuries-old institutions that have kept Black households from building and growing wealth at the same rate as is the case for white households.

Novel policy proposals that can help shrink the Black-white wealth gap

The National Advisory Council on Eliminating the Black-White Wealth Gap developed a range of novel policy proposals throughout 2020 that followed the aforementioned principles. These policies are especially targeted toward Black Americans, building and expanding on several existing proposals that could reduce the wealth disparity between Black and white households by helping Black Americans gain more wealth.

Derived from a CAP issue brief published in November 2020, this proposal recommends that the executive branch explicitly prioritize eliminating the Black-white wealth gap, as it is the result of the collective and compounding impact of centuries of oppression. Now there must be a full-scale, intentional, and strategic plan that reaches across the entire federal government and puts in place actual infrastructure to tackle racial inequality. The issue brief provides the Biden administration with a menu of options, many of which have been adopted already.9 They include creating a White House Racial Equity Office; appointing a senior adviser to the president on racial equity; directing the Office of Management and Budget to conduct racial equity assessments on policy measures; adding a principal position to the National Economic Council focused on eliminating the racial wealth disparity; establishing an interagency task force that would provide steps each agency could take toward increasing wealth for Black communities and communities of color; and encouraging agencies to prioritize addressing racial wealth inequality. This menu of options is intended to provide mechanisms by which the federal government, including the White House and federal agencies, would hold itself accountable to the goal of centering race and equity in policymaking.

Black workers and their families have a rare source of opportunity and security in public sector jobs. Government jobs alone cannot solve structural racism, but public sector jobs offer Black workers a greater measure of economic security than they can often find in private sector employment. Secure employment with predictable wages and benefits, a stable working environment, and stronger protections for workers in the public sector has been a significant source of security for Black workers. That also means that slowing growth in government employment, especially in the wake of the Great Recession of 2007 to 2009, represents a disproportionate shrinking of economic opportunities for African American workers.

Amid the fallout from the pandemic, state and local governments have made deep cuts to public sector jobs. Black workers have seen economic gains thanks to their hard work in the public sector. These income and wealth gains are now at risk again. In September 2020, 211,000 fewer Black workers had a job in the public sector than was the case in September 2019.11

In the wake of the COVID-19 pandemic, the federal government can ensure that state and local governments receive the funding they need, for instance, through the passage of the American Rescue Plan. Now, these additional funds need to lead state and local governments to bring back jobs in an equitable manner; otherwise, they risk endangering the financial security of millions of middle-class Black households, threatening to make the wealth gap even harder to close and undermining one of the only means of substantially reducing racism and racial wealth disparities.

Many households lack access to mainstream banking institutions, which contributes to households being either unbanked or underbanked. This is especially acute for communities of color. Policymakers will need to make long-term structural changes to achieve more equitable outcomes for Black households as the country considers the necessary next steps to rebuild its economy and society after the pandemic and the ensuing economic crisis. Postal banking should be a core part of the U.S. Postal Service's mission to deliver services to almost every community in the country. The federal government could provide vital access to financial services by broadening the mandate of the Postal Service to offer postal banking (such as stable bank accounts for those who are underbanked or unbanked, small loans, and check-cashing services), which could reduce the wealth-stripping effect that exclusionary and predatory financial institutions can cause. Such a system could also serve as a public distribution method for federal and state benefits such as the economic impact payments in the CARES Act or the quarterly or monthly distribution of the earned income tax credit and the child tax credit. Postal banking would overcome a structural barrier for African Americans in the U.S. financial system and would reduce the damage done to many Black households and communities that regularly face predatory lenders and lose large shares of their wealth.

African Americans own fewer than 2 percent of small businesses with any employees, but they make up 13 percent of the U.S. population. In comparison, white households own 82 percent of small employer firms, even though they account for only 60 percent of the U.S. population.

Wide and persistent inequities in wealth and access to capital cause these disparities in small business ownership. The federal government can play an important role in creating a more equitable business environment, even though in the past, it has often perpetuated rather than mitigated these inequities. The Biden administration could help cut small business disparities if it decided to overhaul a long-neglected agency that is part of the U.S. Department of Commerce: the Minority Business Development Agency (MBDA). A reenvisioned MBDA could then take the following steps:

Black researchers, inventors, and entrepreneurs face large hurdles in receiving federal research and development (R&D) funds in the current design and application of such funds. The Biden administration and Congress can lower racial gaps in R&D funding and offer a pathway for R&D dollars that both dedicate funding to Black-led research and establish an innovation dividend.

The proposal, developed in a previous CAP report, envisions additional financial support for R&D by Black inventors and entrepreneurs:

The proposal further envisions the creation of an innovation dividend. The federal government would have to spend $125 billion annually on new R&D, which is higher than the current low of about $100 billion per year. Underlying this calculation is the assumption that the federal government's annual R&D spending will grow with gross domestic product, based on the Congressional Budget Office's (CBO) long-term economic projections. Each new and successful investment is assumed to last for 20 years, equal to the usual length of patent protection. The calculation further assumes that all investments create an average non-inflation-adjusted rate of return of about 3 percent; this is close to the long-term, risk-free rate of return assumed by the CBO but well below historical averages. The federal government would receive the extra value of these investments, so private companies' profits would come only from private sector investments. The federal government could pay out these funds as innovation dividends, typically in the form of targeted cash payments, to Black Americans, who have been left out of innovation funding for decades.
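
Using the inputs stated above, the scale of the dividend pool can be sketched. The steady-state framing below is an illustrative simplification, not CAP's published model, which additionally grows spending with GDP.

```python
# Back-of-the-envelope sketch of the innovation-dividend arithmetic described above.
# Inputs come from the text; the steady-state calculation is an illustrative
# simplification, not CAP's published model.

annual_rd_spending = 125e9   # $125 billion in new federal R&D per year
investment_life = 20         # each successful investment assumed to last 20 years
rate_of_return = 0.03        # ~3% average non-inflation-adjusted return

# In steady state, 20 annual vintages of investment are "alive" at once.
capital_stock = annual_rd_spending * investment_life   # $2.5 trillion
annual_dividend_pool = capital_stock * rate_of_return  # $75 billion per year

print(f"Capital stock: ${capital_stock / 1e12:.1f} trillion")
print(f"Annual dividend pool: ${annual_dividend_pool / 1e9:.0f} billion")
```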

Even with innovative policy solutions, the Black-white wealth gap will persist

The data for the past three decades show large and persistent disparities in wealth, assets, and debt between Black and white households. Wealth is the difference between what households own (their assets) and what they owe (their debt). For most households, assets are larger than debt, meaning they own at least some wealth. Assets include peoples houses, their retirement accounts, their checking and savings accounts, and their cars, for example. The expected future income from an employer's pension is a somewhat unique asset. On the one hand, it provides households with a secure stream of income in the future; on the other hand, it is not an asset that households can borrow against or pass on to their heirs. The table below shows wealth inequality between Black and white households both with and without defined-benefit pension wealth.

The data highlight several key points. First, Black households have a fraction of the wealth of white households. For instance, the median wealth of Black households with defined-benefit pensions was $40,400 in 2019, or 15.5 percent of the $258,900 in median wealth for white families. (see the downloadable table)15 The smallest relative gap between Black and white household wealth appears in average wealth that includes defined-benefit pensions; by this measure, Black households' wealth amounts to 22.5 percent of white households' wealth. (see Figure 1) The largest gap appears in median wealth without defined-benefit pensions; by this measure, the median Black household owns 12.7 percent of the wealth of the median white household. No matter which wealth measure is used, Black households have far less wealth than white ones.

Figure 1

Second, defined-benefit pensions have a slightly equalizing effect. The Black-white wealth gap shrinks somewhat when the imputed value of defined-benefit pensions is counted as an asset. This equalizing effect is larger for average wealth than for median wealth. For example, average Black household wealth increases from 14.5 percent of average white household wealth without defined-benefit pensions to 22.5 percent with them, shrinking the gap by 8 percentage points. At the median, the effect is only a 2.8 percentage-point decrease in the gap, from 12.7 percent to 15.5 percent. That is, the extra measure of wealth equality from defined-benefit pensions matters mainly for higher-income earners with stable jobs. Since such opportunities are often rare for Black workers in the private sector, the effect is much smaller at the median.
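
The equalizing effect is just the difference between the with-pension and without-pension wealth ratios quoted above, in percentage points:

```python
# The equalizing effect of defined-benefit (DB) pensions, in percentage points,
# using the 2019 Black/white wealth ratios quoted above.

avg_without_db, avg_with_db = 14.5, 22.5        # average wealth ratio (%)
median_without_db, median_with_db = 12.7, 15.5  # median wealth ratio (%)

print(f"Average: +{avg_with_db - avg_without_db:.1f} percentage points")       # +8.0
print(f"Median:  +{median_with_db - median_without_db:.1f} percentage points") # +2.8
```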

A key point, which is not shown in Figure 3 but is apparent in the same data, is that Black workers have more access to stable jobs with good benefitsincluding defined-benefit pensionsin the public sector than in the private sector. As a result, wealth inequality among public sector workers is much smaller than among private sector workers.16 This effect becomes even larger when comparing public sector workers in unionized jobs with their private sector counterparts who are not covered by a collective bargaining agreement.17 Access to stable, well-paying jobs with decent benefits is rarer for Black workers than for white ones. Such accesswhich is more common in the public sector than in the private sectorcan help shrink but not eliminate the Black-white wealth gap in large part because of the value of a defined-benefit pension.

Third, there is no long-term trend toward a smaller Black-white wealth gap. In fact, the relative difference between Black households' wealth and that of white households was generally smaller from 1992 to 2007 than in the years after the Great Recession. For instance, the median wealth with defined-benefit pensions of Black households amounted to 20.1 percent of that of white households in 1998 and 19.8 percent in 2004. Since the Great Recession, this ratio of Black households' median wealth to white households' median wealth reached its highest point, 15.5 percent, in 2019. Black households' wealth has remained far below that of white households throughout the past three decades.

Fourth, the wealth gap persists even when the data account for income differences. Black households have much lower wealth-to-income ratios than white households do. For example, the median wealth-to-income ratio that includes the imputed wealth of defined-benefit pensions has rarely exceeded 100 percent for Black households. (see the downloadable table)18 However, it has never fallen below 300 percent for white households, and it stood at 395.5 percent in 2019. That is, the large Black-white wealth gap does not follow from lower incomes among Black households.
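
The wealth-to-income ratio is simply household wealth divided by annual household income; a ratio below 100 percent means a family owns less than one year's income in wealth. The households in this sketch are hypothetical illustrations, chosen to mirror the thresholds cited above:

```python
# Wealth-to-income ratio: household wealth divided by annual household income.
# The example households are hypothetical; the thresholds mirror the text
# (ratios near 100% for Black households vs. 395.5% for white households in 2019).

def wealth_to_income_ratio(wealth: float, income: float) -> float:
    """Return wealth as a percentage of annual income."""
    return wealth / income * 100

# A household with $50,000 income and $48,000 wealth sits just under 100%.
print(f"{wealth_to_income_ratio(48_000, 50_000):.1f}%")   # 96.0%
# A household with the same income but $200,000 wealth is at 400%.
print(f"{wealth_to_income_ratio(200_000, 50_000):.1f}%")  # 400.0%
```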

In the same vein, the data show large Black-white wealth gaps among separate subpopulations. (see Table 1) The table breaks the data down by education, family status, age, and income in addition to race. In all groups, white households have vastly more wealth than Black households. The overall Black-white wealth gap is thus not a result of differences in these characteristics. For example, white households with high school degrees have $151,651 more in wealth on average than Black households with a college degree. In fact, white households without a high school degree have wealth levels similar to those of Black households with college degrees: $230,165 compared with $270,288. Other research at a more regionally granular level has regularly found that white households without a high school degree have, on average, more wealth than Black households with a college degree.19 Put differently, Black Americans gaining more education does not close the Black-white wealth gap. The data indicate similar conclusions about income levels and marital status.20 Black Americans clearly encounter massive and systematic obstacles that make it impossible to catch up to their white counterparts.

Table 1

The data in Table 1 on wealth by age in fact suggest that these obstacles are cumulative. The Black-white wealth gap tends to be larger for older groups of households than for younger ones. Data for married couples broken down by cohorts show that the Black-white wealth gap widens as people get older.21 Black Americans encounter systematic obstacles and systemic racism when trying to save for their future, while white households receive additional help from their families (for example, in the form of more frequent and larger inheritances), causing the Black-white wealth gap to grow over people's lifetimes.22

Fifth, Black households' wealth declined more after the Great Recession than white households' wealth did, and white households' wealth grew faster in the immediate aftermath of that financial and economic crisis. Regardless of the measure of wealth (median or mean, with or without defined-benefit pensions), the gap between Black and white households' wealth was larger in 2019 than in 2004 and 2007, before the Great Recession started.

Additional Federal Reserve data suggest that the recession of 2020 could show a similar pattern of a widening Black-white wealth gap. Figure 2 shows the average wealth with defined-benefit pensions for Black and white households.23 The Black-white wealth gap widened over the course of the recession through September 2020. The average wealth of Black households was $241,951, which was 0.7 percent below the $243,764 recorded at the end of 2019, before the recession started. In contrast, average white household wealth was 3.3 percent higher, at $1.17 million in September 2020 compared with $1.13 million at the end of 2019. Black households' wealth recovered more slowly than that of white households, widening the wealth disparity continuously throughout the recession.
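
The percent changes quoted above are simple ratios of September 2020 wealth to end-of-2019 wealth. Recomputed from the figures in the text:

```python
# Percent change in average household wealth from Q4 2019 to September 2020,
# computed from the figures quoted above.

def pct_change(new: float, old: float) -> float:
    return (new / old - 1) * 100

black_2019, black_2020 = 243_764, 241_951
print(f"Black households: {pct_change(black_2020, black_2019):+.1f}%")  # -0.7%

# The white-household figures in the text are rounded ($1.13M -> $1.17M), so this
# recomputation gives ~+3.5%; the report's unrounded data yield the +3.3% cited.
white_2019, white_2020 = 1.13e6, 1.17e6
print(f"White households: {pct_change(white_2020, white_2019):+.1f}%")
```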

Figure 2

Several reasons account for this widening disparity between Black and white wealth during recessions. First, on average, Black workers always have worse labor market experiences than white workers. (see Figure 3) They suffer from higher unemployment, longer spells of unemployment, earlier layoffs in a recession, later rehiring in a recovery, more job instability, and lower wages.24 Less access to good, stable jobs means that African Americans have fewer opportunities to save money as well as more need to rely on their savings because they face more labor market risks.

Figure 3

Second, Black households are less likely to own stocks than white households, often because they face more economic risks such as higher chances of layoffs and medical emergencies than white households.25 They also have less access to retirement benefits through their employers, which is one key pathway for more saving and stock market investments for American families.26 African Americans then see fewer wealth gains from a booming stock market, as typically happens starting from the later stages of a recession.

Even worse, the combination of higher unemployment during the recession and fewer stock market investments to begin with means that Black households have fewer opportunities to take advantage of low stock prices in the middle of a recession than white households.27 Black households have less money to invest at a time when the opportunities to invest in the stock market are best because of low stock prices. White households, on the other hand, are more likely to still have a job with higher incomes and more access to stock market investments through employer-sponsored retirement accounts. They can take advantage of low stock prices in the depths of a recession and thus see higher rates of return on their wealth.

Third, Black households are less likely to own their own houses than white households.28 Housing prices have largely stayed strong and even increased in this recession. Black households see fewer gains from such price increases than white households. Worse, even when Black households own their homes, they see smaller price gains than white homeowners do. Their home values increase at a lesser rate because of housing and mortgage market discrimination, fewer public services, and less access to good jobs in predominantly African American communities.29 In essence, wealth leads to more wealth, and this pattern becomes readily apparent in a recession.

As discussed above, the differences in overall Black-white wealth and in rates of return stem from massive gaps in assets, not from more debt among African Americans. On the contrary, Black households typically have less debt than white households do, often because they are shut out of formal credit channels due to financial market discrimination.30 Black households instead owe a lot of so-called consumer credit, such as car and student loans as well as credit cards. Yet they are less likely to have a mortgage due to greater loan denial rates and less access to down payment help from family. The heavy reliance on consumer debt means that the ratio of consumer loans to consumer durables (a measure of how much families need to use debt for ongoing expenses) is higher for African Americans than for any other racial or ethnic group. Black households essentially use consumer debt to cover part of their expenses, while white households go deeper into mortgage debt to invest in an asset that appreciates.31 African Americans thus owe more costly and risky debt, such as car loans and credit card debt, and often pay more for their debt than white households do, but the amount of debt that Black households owe is smaller in absolute terms and relative to income than is the case for white households.32 High-cost and high-risk debt is a key aspect of wealth stripping in the African American community, but it is not the overarching contributor to the Black-white wealth gap. A systematic lack of access to opportunities for owning and maintaining assets is the primary cause.

Conclusion

The work of the National Advisory Council on Eliminating the Black-White Wealth Gap shows two important things. First, it is possible to develop and enact in short order a number of policies that could have a meaningful long-term effect on reducing the Black-white wealth gap. Second, a smaller, but still substantial, Black-white wealth gap would persist even if policymakers enacted all policies mentioned in this report in addition to several large-scale proposals put forward by CAP and others. Eliminating the disparities between Black and white wealth is a generational undertaking, but it is one that this country can and must tackle.

The proposals summarized in this report show that it is possible to enact novel policies to shrink the Black-white wealth gap. These proposals expand the portfolio of possible new measures to address this massive inequality. Other policies that can also shrink this wealth disparity include so-called baby bonds: annual payments to children under the age of 18 that are tied to parents' income or wealth.33 They also include debt-free college education, universal retirement accounts,34 full enforcement of civil rights legislation in housing markets, and strict regulation and enforcement of financial market regulation in all credit and asset markets.35

A key difference between the novel proposals laid out in this report and already-proposed policies is that the new approaches focus solely or primarily on lifting up wealth for African Americans, while other proposals largely favor Black households but also provide help to white families in building wealth. That is, these new proposals could have a substantial effect on shrinking the Black-white wealth gap.

But a substantial Black-white wealth gap will remain, at least between average wealth for Black families and average wealth for white families, even if all of these proposals were immediately enacted. Broad measures that benefit both Black and white households have a diffuse effect on the Black-white wealth gap at the average, although they can substantially shrink this wealth disparity at the median.36 At the same time, the targeted proposals laid out in this report will take time to have a meaningful effect. Moreover, the sum of these proposals does not fully erase the massive intergenerational advantage that white households have in building wealth.

These intergenerational wealth transfers come in the form of gifts and inheritances as well as access to social networks. For the years 2010 to 2019, white households in which the heads of household were between the ages of 55 and 64 years old had received gifts and inheritances equal to $101,354 (in 2019 dollars). In comparison, Black households had received $12,623 at that time. Furthermore, older white households expected to get an additional $75,214 as gifts and inheritances, while Black households expected $2,941. This represents a total gap of $161,004 in received and expected gifts and inheritances and does not count additional intergenerational wealth transfers such as nepotism and access to social networks.37
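
The $161,004 figure is the sum of received and expected transfers for white households minus the same sum for Black households, reconstructed here from the figures in the text:

```python
# The gap in received and expected gifts/inheritances, reconstructed from the
# figures quoted above (households headed by 55- to 64-year-olds, 2010-2019,
# in 2019 dollars).

white_received, white_expected = 101_354, 75_214
black_received, black_expected = 12_623, 2_941

gap = (white_received + white_expected) - (black_received + black_expected)
print(f"Total gap: ${gap:,}")  # Total gap: $161,004
```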

In this regard, it is important to note that experts, researchers, and policymakers are considering the rationale, design, and effects of reparations to Black households to address the lasting economic impacts of slavery. One legislative vehicle currently pending in Congress to study and put forward a plan for implementation of reparations is H.R. 40.38 Originally introduced by the late Rep. John Conyers (D-MI) every year between 1989 and 2017, and subsequently introduced by Rep. Sheila Jackson Lee (D-TX), H.R. 40 would create a commission to study and submit to Congress a report on reparations for the government-sanctioned institution of slavery and ensuing discrimination against freed slaves and their descendants. Notably, this bill only proposes a study and recommendations; passage of the bill would not necessarily lead to reparations. Unless legislation to study reparations passes, the executive branch should engage with cultural and historical resources, such as the National Archives and Records Administration, the Smithsonian Institution, and the National Park Service, to promote historical education for the public to increase awareness of the myriad underlying causes that have contributed to the massive and persistent Black-white wealth gap.

Moreover, public and private policies need to be regularly revisited and revamped to eliminate racial biases that systematically disadvantage Black households. Without large, long-term investments in addressing the Black-white wealth gap, massive differences in economic security and opportunity will not only persist but may widen for generations.

About the authors

Christian E. Weller, Ph.D., is a senior fellow at the Center and a professor of public policy at the McCormack Graduate School of Policy and Global Studies at the University of Massachusetts Boston.

Lily Roberts is the managing director for Economic Policy at the Center for American Progress.

Acknowledgments

The Center for American Progress would like to thank the members of the National Advisory Council on Eliminating the Black-White Wealth Gap for all of their time, hard work, inspiration, and thought leadership. We are especially grateful to co-chairs Darrick Hamilton and Kilolo Kijakazi for sharing their critical insights, deep expertise, and long-standing commitment to racial justice. This project would not have been possible without the vision and untiring commitment to racial equity from Danyelle Solomon, former vice president for Race and Ethnicity at the Center for American Progress. To learn more about the council, read: CAP Announces Formation of the National Advisory Council on Eliminating the Black-White Wealth Gap.

Appendix

Kilolo Kijakazi, institute fellow, Urban Institute; co-chair

Darrick Hamilton, executive director, Kirwan Institute for the Study of Race and Ethnicity, The Ohio State University; co-chair

Mehrsa Baradaran, professor of law, University of California, Irvine

Lisa D. Cook, associate professor of economics and international relations, Michigan State University

Henry Louis "Skip" Gates, Alphonse Fletcher Jr. University professor and director, Hutchins Center for African and African American Research, Harvard University

Ibram X. Kendi, professor of history and international relations and founding director, Antiracist Research and Policy Center, American University

Trevon Logan, professor of economics and associate dean, College of Arts and Sciences, The Ohio State University

Anne Price, president, Insight Center

Richard Rothstein, distinguished fellow, Economic Policy Institute; senior fellow, emeritus, Thurgood Marshall Institute of the NAACP Legal Defense Fund and of the Haas Institute at the University of California, Berkeley

Rhonda Sharpe, founder and president, Women's Institute for Science, Equity, and Race (WISER)


FM-2030 – Wikipedia

Iranian-American-Belgian transhumanist philosopher and futurist


FM-2030 (born Fereidoun M. Esfandiary; October 15, 1930 – July 8, 2000) was a Belgian-born Iranian-American[1] author, teacher, transhumanist philosopher, futurist, consultant, and Olympic athlete.[2]

He became notable as a transhumanist with the book Are You a Transhuman?: Monitoring and Stimulating Your Personal Rate of Growth in a Rapidly Changing World, published in 1989. In addition, he wrote a number of works of fiction under his original name F.M. Esfandiary.

FM-2030 was born Fereidoun M. Esfandiary on October 15, 1930, in Belgium, to the Iranian diplomat Abdol-Hossein A. H. Sadigh Esfandiary (1894–1986), who served from 1920 to 1960.[3] He travelled widely as a child, having lived in 17 countries, including Iran, India, and Afghanistan, by age 11.[4] He represented Iran as a basketball player and wrestler at the 1948 Olympic Games in London. He attended primary school in Iran and England and completed his secondary education at the Collège des Frères, a Jesuit school in Jerusalem. By the time he was 18, aside from his native Persian,[5] he had learned to speak four languages: Arabic, Hebrew, French, and English.[6][7] He then started his college education at the University of California, Berkeley, but later transferred to the University of California, Los Angeles, where he graduated in 1952.[8] Afterwards, he served on the United Nations Conciliation Commission for Palestine from 1952 to 1954.[9]

In 1970, after publishing his book Optimism One,[10] F.M. Esfandiary[7] started going by FM-2030 for two main reasons: firstly, to reflect the hope and belief that he would live to celebrate his 100th birthday in 2030; secondly, and more importantly, to break free of the widespread naming conventions that he saw as rooted in a collectivist mentality, existing only as a relic of humankind's tribalistic past. He legally changed his name in 1988. He viewed traditional names as almost always stamping a label of collective identity, varying from gender to nationality, on the individual, thereby serving as prima facie elements of thought processes in the human cultural fabric that tended to degenerate into stereotyping, factionalism, and discrimination. In his own words, "Conventional names define a person's past: ancestry, ethnicity, nationality, religion. I am not who I was ten years ago and certainly not who I will be in twenty years. [...] The name 2030 reflects my conviction that the years around 2030 will be a magical time. In 2030 we will be ageless and everyone will have an excellent chance to live forever. 2030 is a dream and a goal."[11] As a staunch anti-nationalist, he believed, "There are no illegal immigrants, only irrelevant borders."

In 1973, he published the political manifesto UpWingers: A Futurist Manifesto, in which he viewed the ideological left and right as outdated and instead proposed a schema of UpWingers, those who looked to the sky and the future, and DownWingers, those who looked to the Earth and the past. FM-2030 identified with the former. He argued that the nuclear family structure and the idea of the city would disappear, replaced by modular social communities, called mobilia, animated by a communitarian spirit, which would themselves persist for a time and then disappear.[12]

FM-2030 believed that synthetic body parts would one day make life expectancy irrelevant, and shortly before his death, he described the pancreas as "a stupid, dumb, wretched organ."[13]

On civilization, he stated: "No civilization of the past was great. They were all primitive and persecutory, founded on mass subjugation and mass murder." On identity, he stated: "The young modern is not losing his identity. He is gladly disencumbering himself of it." He believed that nations would eventually disappear and that identities would shift from the cultural to the personal. In a 1972 op-ed in The New York Times, he wrote that the Arab-Israeli conflict reflected failed leadership, with the warring sides, "acting like adolescents," refusing to "resolve their wasteful 25-year-old brawl," and he believed that the world was irreversibly evolving beyond the concept of a national homeland.[14]

He was a lifelong vegetarian and said he would not eat anything that had a mother. He famously refused to answer questions about his nationality, age, and upbringing, deeming them irrelevant because he considered himself a global person.[15] FM-2030 once said, "I am a 21st century person who was accidentally launched in the 20th. I have a deep nostalgia for the future."[16] Because he spent much of his childhood in India, he was noted to have spoken with a slight Indian accent.[17] He taught at The New School, the University of California, Los Angeles, and Florida International University.[2] He worked as a corporate consultant for Lockheed and J. C. Penney.[2] He was also an atheist.[18] FM-2030 was, in his own words, a follower of "upwing" politics, by which he meant that he endorsed universal progress.[19][20] From the 1960s until his death, he was in a non-exclusive "friendship" (his preferred term for a relationship) with Flora Schnall, a lawyer who graduated from Harvard Law School's class of 1959 alongside Ruth Bader Ginsburg.[21] He resided in Westwood, Los Angeles, as well as in Miami.[22]

FM-2030 died on July 8, 2000, from pancreatic cancer at a friend's apartment in Manhattan. He was placed in cryonic suspension at the Alcor Life Extension Foundation in Scottsdale, Arizona, where his body remains today. He did not yet have remote standby arrangements, so no Alcor team member was present at his death, but FM-2030 was the first person to be vitrified rather than simply frozen, as previous cryonics patients had been.[15] FM-2030 was survived by four sisters and one brother.[7]


Biden Builds Transhuman Cyborg Army using Immigrants!

Join DeAnna Lorraine on her new show Shots Fired! with DeAnna Lorraine! DeAnna first goes over the hottest headlines of the day, joined by Lauren Witzke. They discuss the disastrous Hurricane Ian and potential weather manipulation, the sabotage of Nord Stream, another huge fire at a major food production plant that feeds millions, Coolio's death, and more.

Then DeAnna is joined by a prominent natural doctor, Dr. Jason Dean, who goes into more depth about the new demonic Executive Order that Biden just signed, which will accelerate Transhumanism and Genetic Modification and even create an army using immigrants and minorities! A must-watch and must-share interview!

Subscribe to Red Voice Media Premium using promo code DEANNA to watch the full Shots Fired show with DeAnna Lorraine, every Thursday at 6 p.m. CT!

Make sure you FOLLOW DeAnna on Gettr, Truth Social, Telegram and Gab: @RealDeAnnaLorraine and join her Telegram channel and live chat during the show and throughout the week! http://t.me/deannasChannel


Crisis of the Weak-of-Survival Being Favored over the Strong-of …

The Power Ape-Man in 2001: A Space Odyssey

Suppose there are organisms in a certain environment. They compete with other organisms and even face invasion by foreign organisms. Within these native organisms, there are those that react strongly to threats and act accordingly. They fight or build defenses. And then there are those that are passive, weak, or even welcoming of rivals or invaders. Over time, what will happen? The Law of Survival will weed out the weak members, as they'll be conquered and devoured by rivals or invaders. Meanwhile, the strong members will survive through their tenacity and fighting spirit. In time, the organisms will be defined by the survivors with the spirit of warriors. That way, the organisms will remain strong.

But what if a different set of dynamics takes hold of this environment? Suppose there is a Power that coddles and protects the weak-willed members of the organism while hampering the strong-willed members that are exposed to constant attacks and invasions. The weak-willed survive because they don't have to fight under the protection of the Power. In contrast, the strong-willed come under ceaseless pressure. Furthermore, they are prevented by the Power from using all means at their disposal to counter the attacks and invasions. What will happen over time? The strong-willed will wither, fade, and eventually be forced to cower before the enemy. After all, even the strongest bear or biggest bull can eventually be brought down by a pack of wolves; even a giant lizard succumbs to a massive killer ant attack. Meanwhile, the weak-willed members survive and even thrive, but as pathetic puppets and minions of the Power that protects them (and subverts the defensive capability of the defeated strong-willed members).

Imagine an environment with lots of chimps. Among them, there are strong-willed chimps and weak-willed chimps. Strong-willed chimps are vigilant, always on the lookout, and ready to fight for territory, females, and food. Weak-willed chimps, on the other hand, are passive and kindly toward outsiders, be they rival chimps or dangerous animals (such as leopards). Now, when crisis breaks out, the strong-willed chimps will prioritize survival and go into fight-or-flight mode: fight those that can be defeated, take flight from the stronger, and set up a wall of defense. In contrast, the weak-willed members will be slower to flee from danger. They may even move toward danger as a friend. They'll act like the doofus scientist in the 1950s sci-fi horror THE THING, a naive brainiac who seeks to commune with and understand the fearsome and ruthless creature from another planet. Over time, as the weak-willed chimps are weeded out by murderous enemy chimps and predators, the chimp community will have more strong-willed members.

But suppose a Power takes over the chimp community. It creates a well-stocked sanctuary for the weak-willed chimps, which thus become favored in the game of existence. Despite possessing traits disadvantageous for survival, they are favored and coddled by the Power. The strong-willed chimps get no such protection and are therefore disadvantaged in survival. They must fight and struggle to survive, and tough as they are, some are destroyed or devoured by rival chimps and predators. But there is worse. The Power decides to make things more difficult for the strong-willed chimps. Their fangs are ground down so their bites are far less effective. Also, they are supplied with narcotics, and many succumb to addiction. Under such organizing principles, the weak-willed members survive (but essentially as chattel dependent on the protection and mercy of the Power), while the strong-willed members dwindle in number and are eventually destroyed.

In a way, the favoring of the weak-willed over the strong-willed is the story of civilization. It is also a strategy of power. It can be advantageous to a people if they control the terms of domestication, but it can be disadvantageous(and eventually fatal) if the terms are controlled by another group.

There are parallels between humans and dogs, though some human groups and certain dog breeds became more domesticated than others. The Golden Retriever became more domesticated than the Alaskan Husky, which, despite living with man, still came under tremendous natural pressures in freezing climates and in proximity to dangerous predators such as polar bears and wolves. Dogs are weaker and smaller than wolves, their ancestors. They are also weaker-willed and more prone to trust and be friendly with other organisms, especially humans. As such, humans favored and protected dogs. But humans also owned dogs as property, as pets and servants. Thus, even though countless dogs led far safer and happier lives in the protective human realm than wolves did in the wild, they were at the mercy of their human masters. But humans didn't merely favor dogs over wolves; they made a concerted effort to make things difficult and often deadly for the wolves. Therefore, even though wolves have greater survival skills than dogs if both were placed in the same wilderness (indeed, it's likely that most, even all, dogs would be destroyed in the wild), the Power of Man has made it so that weak-willed dogs have a far greater chance of survival than wolves in the wild (which has been limited to wilderness preserves). The interference of the Power made it so that the wolf's natural advantage became a disadvantage, whereas the natural disadvantage of the dog became an advantage under Man. After all, mankind naturally prefers the trusting, submissive, and friendly dog to the ferocious and proud wolf. Dogs have done better under humans than in the wild, but at the loss of all pride, autonomy, and independence. Still, as they are animals, pride doesn't matter much to them. But what about people who've lost pride and independence?

But then, can real pride and independence exist in civilization? After all, if people, as truly free individuals, chose to do as they please, civilization would fall apart. Imagine a world run over by Alexes of A CLOCKWORK ORANGE. Despite all the talk of freedom and individualism, the main reason modern civilization holds together and continues is that most people support or serve the hierarchy and adhere to the values and narratives pushed by the Power. Also, the Power enforces the same sets of laws, language, and lore over the vast populace. Under communism in the Soviet Union, the law was Marxist-Leninist, the language of the empire was Russian, and all children were raised on the lore of communist saints and heroes.

There's been far more freedom in the West, but the system cannot be sustained unless enough people adhere to the existing Power Structure. For most people in the West, there is a measure of freedom in their personal lives but hardly any freedom or means to change the workings of the existing power structure. Only a handful of people with the means to enter the inner sanctums of power can make a real difference. Also, even personal choices are shaped, even dictated, by a handful of big players. Most movies are made by Hollywood, or Movie Inc. People choose from what is offered to them by mega-corporations, just as voters choose from a bunch of politicians vetted by the ruling power; i.e., people vote for puppets, not leaders. People may select from various media outlets that create the impression of choice, but most media are controlled by a handful of Jewish oligarchs. People may choose the kind of music they like, but pop trends are dominated by a few entertainment oligopolies. There was talk of how the internet would unleash an era of citizen journalism and alternative views, but the biggest platforms are dominated by Zionist Jews who shut down what they deem as hate speech. Jewish oligarchs at Google also manipulate algorithms so that Jew-run news outlets are favored in search results over voices critical of Jewish Supremacism. Therefore, what is called free press and free speech is highly proscribed and controlled in the Free West.

Indeed, paradoxically enough, people in a democracy might be even more clueless as to what's really happening, because the conceit of liberty and freedom blinds them to the fact that they aren't so free. At least people in Iran and China know their freedoms are restricted by the State. In the West, many are still under the delusion of living in a liberal democracy when, if anything, they are minions of a Jewish Supremacist Oligarchy. Labels can fool a lot of people. It's like the fat-free label that fools so many people who don't realize that the fat has been replaced by more sugar. Same with "progressive" and "conservative." So much of what is nowadays labeled as progressive or conservative is anything but. So-called progressive Democrats are totally in cahoots with a Wall Street that pushes globo-homo to replace May Day with Gay Day. And so-called conservative Republicans are now into chanting that gay marriage and trannies-in-washrooms are conservative values. How the world loves a label more than the reality.

In a way, this loss of true freedom and independence is the price we all paid for civilization. A civilization can be more free or less free, but when push comes to shove, it must be about most or the great majority submitting to the power, the status quo. Those in power may change (American Power went from Wasp Rule to Jewish Rule), but regardless of who is on top, most people must go along. So, Russia went from the people obeying the Czars to obeying the Commissars to obeying the oligarchs. And most Germans went from obeying the Kaiser to obeying the Weimar Republic to obeying the Nazis to obeying the bureaucrats in West Germany or East Germany. Even if many people are cynical about power and disrespect the ruling elites, they've no choice but to go through the daily motions of working for the system. In other words, even the disobedient find they've no choice but to obey to make a living.

And even when the people do rise up and overthrow the existing system, as in the case of the Shah's Iran, the only way civilization can continue is if most people support or comply with the new order. Civilization cannot tolerate too many wolves. It needs lots of dogs. As for controlling the power, it usually goes to the weasels. George Orwell, in ANIMAL FARM, illustrated how the banishment of humans only led to the rise of Pig Tyranny. But then, as bad as the pigs are, can the animals govern themselves? Besides, domestication means becoming part of a system, an order based on organizational principles. It is then the nature of domesticated organisms to long for the iron hand, although so-called Liberal Democracy has learned to cover it with a velvet glove. As individuals, we can only be so free. After all, we don't want to live in a world of chaos where everyone, as an independent maverick, makes up his or her own rules. This is why so many manifestations of rebellion and difference in a liberal democracy are manufactured as a chimera by the Power: have the rebels conform to officially tolerated or approved forms of rebellion, like cheering loudly at Rock concerts, piercing one's nose, or turning one's hair green, all of which are harmless to the Power (while harmful to the pride of resistance). All these differences lead to new conformist communities rather than truly independent turns of mind and spirit. It's like how the Power's idea of a dark web dissident right turned out to be Zionists like Ben Shapiro & Dave Rubin and shills of Zionists like Jordan Peterson. But then, even if dissident rightists were to come to power, wouldn't they prop up their own favored Norms and Sacraments as the governing principle of the new order?

Civilization must favor the mild-willed over the strong-willed. While weak-will is too sappy, strong-will is too contentious. While society gains something by having some strong-willed leaders and alphas, most people must be less-strong-willed if people are to get along and go along. (Also, if two civilizations are defined by mild-mannered-ness, they may find ways to co-exist and cooperate than remain locked in terms of conflict. Mild-willed outlooks can serve as roads and bridges between civilizations.)

If everyone were strong-willed, itd be an endless battle of egos. Therefore, most people must be mild-willed, somewhere between weak-will and strong-will. And the meritocratic system is geared to favor mild-willed over strong-willed, that is unless the strong-willed happen to be particularly gifted in intellect, creativity, or leadership qualities. After all, what is required to do well in school, gain credentials, and find good jobs? One must be patient and diligent. One must be reasonably obedient to teachers and authority figures. Despite the American mythos of the cool rebel, most people who succeed play by the rules. No wonder women and Asians are favored in the current order. Both are more mild-willed than white males who tend to be a bit more adventurous and cantankerous in spirit. Obama certainly understood whos boss(the Jews) and did as told to be handpicked to be president, or cuck-in-chief of the Jews. One reason why Jews cant stand Donald Trump is the way he became president. He howled too much like a wolf than acted the well-heeled canine in a dog-show. Though a total dog to Jews in substance, he was wolf in style, and the Jewish Masters of America took this very badly, and the whole Russian Collusion Hoax and other nonsense were a means to punish the Bad Doggy.

Anyway, precisely because civilization favors the mild-willed over the strong-willed for most of its managerial positions, there is the real danger of a survival-deficit in elite ranks of society. Consider nations like Sweden. Well-ordered and well-run, peaceful and prosperous Sweden elevated mild-willed individuals to upper levels of government and institutions. Indeed, its military is run by a bunch of mild-willed women who did the homework and did as told in their student days. So, is it any surprise that the Swedish state is so soulless, gutless, and bland? Its managerial class may be well-educated, diligent, and competent on the technical level, but they lack patriotic passion, survival instinct, and requisite ruthlessness toward potential threats and enemies. If anything, it is most triggered by the emergence of strong-willed Swedes who see what is happening and demand that something drastic be done to stop the invasion and great replacement.

Since individuals cant be truly free and independent within a civilization, the only way for a people to be free is as a collective. While Me-the-Person can only be so free within the Order, We-the-People can be free from the control of Other Peoples. Its like Asian Indians gained independence by rising up against British overlords and expelling them. The Vietnamese gained national liberation by resisting French Colonialism and then American Neo-Imperialism. And it was as a collective that Russians pushed back against Napoleonic France in the 19th century and Nazi Germany in the 20th century. Freedom for the Motherland couldnt have been won by Russians as individual wolves. They had to cooperate and fight as Russian dogs in defense of the Order.

While ideally the freedom of we-the-people should expand the freedom of me-the-person within the Order, it hasnt always been so. Textbook examples are Tokugawa Japan, Red China, Castros Cuba, Islamic Iran, and North Korea. Though politically independent and relatively free of foreign influence, their suppression of me-the-person either intensified or hardly eased despite the autonomy. The reason was either for the survival of the Order or survival of the elites. In certain cases, the Order had to suppress considerations of me-the-person because it was under threat and at a great political-economic-military disadvantage. After all, patriotism and willingness to die were essential among the Vietnamese IF Americans were to be driven out. With excessive freedom of me-the-person, too many Viets might choose not to fight or even join with the other side as collaborators. In Sam Peckinpahs STRAW DOGS, David Sumner(Dustin Hoffman) decides he must force his wife to obey him if they are to defend the house from marauders. She is forbidden from collaborating with the Other side. She is forced to choose we-the-people over me-the-person despite her temptation otherwise.

Castros Cuba also had to be repressive in order to survive. As the US had so much more money, it could have bought off so many Cubans to do the bidding of US interests. Indeed, Cuba had essentially been a CIA-mafia-Jewish-run plantation/casino before Castro led an army of spartan patriots to take power. But, of course, the downside of repression in favor of we-the-people over me-the-person has been downright Orwellian. The system threw the baby out with the bathwater in its purge of turncoats, traitors, spies, and collaborators. Worse, over time, the invocation of we-the-people can become an excuse to perpetuate a system of we-the-elites. This is why a system has to find a balance between me-the-person and we-the-people. One thing for sure, history has shown time and time again that an order that is independent of foreign tyranny can be rife with domestic tyranny.

While all systems must maintain order with some degree of repression and control, some take this to extreme measures due to radical ideology, excessive paranoia, or just plain greed of rulers who stingily hog all the power and privilege. As profoundly different as North Korea and the US are in just about every way, if they have anything in common, its that both are ruled by elites who will do ANYTHING to maintain their supremacist or absolute grip on power. Even though North Korea seems like a fossilized hermit kingdom whereas the US seems a dynamic country constantly reinventing itself, both are essentially governed by the principle of elite-stasis. In other words, the reason why Jews are trying to make America so different is to keep same the power equilibrium, i.e. Jewish Supremacism must define American Power. As Jews are a minority-elite, they fear that stability in America will eventually lead to people realizing theyre ruled by Jews. For that reason, Jews stir up the impression of constant upheaval and transformation to misdirect the American Gaze from the one true constant in American Power Politics: JEWS RULE, JEWS GET RICHER, JEWS EXPAND THEIR CONTROLS.

Anyway, if civilization ordains that people must be servile dogs than defiant wolves, at the very least human-dogs can be ruled by their own kind than by another kind. In other words, English dogs should be ruled by English masters, Japanese dogs should be ruled by Japanese masters, German dogs should be ruled by German masters, Italian dogs should be ruled by Italian masters, Russian dogs should be ruled by Russian masters, Iranian dogs should be ruled by Iranian masters, Jewish dogs should be ruled by Jewish masters(though, to be sure, every Jew feels as a master than dog), and etc. After all, there is greater likelihood that master A will feel greater affection and sense of obligation for dogs A, and master B will for dogs B. Granted, it may not always be so. Master A could be cruel and abusive of Dogs A, and its possible Master B has more sympathy and heart for Dogs A. But generally, rulers of Nation A will have more feelings for the people of Nation A than for the peoples of Nation B, C, D, E, F, etc. Do Jewish rulers in Israel have more feelings for Jewish people or the Arab people, the Palestinians?

Now, one may point to white elites who seem to care just as much, if not more, for non-whites as for whites, but his anomaly is the result of Jewish conquest of the white mind/soul. Jews made it anathema among white elites to care about fellow whites because they want white elites to primarily serve and obey Jews. In other words, to convince white elites to favor the Jewish Other over the White Brother, Jews indoctrinated white elites(and even many among the white masses) that there are few things as evil in the world as whites caring for whites. Its NOT OKAY to be white. Another problem with elites of one nation excessively caring for other peoples than for their own is they will end up ill-serving both. After all, it is a full-time job to govern and take care of a nation. A national elite that tries to save the world as well as govern its own people is like a dog that loses the bone in his mouth for the one reflected in the water. Its like a parent who tries to take care of all the kids in the neighborhood. Hell just fail with all the children, including his own. Also, it makes the elites of other nations lazy and corrupt. Suppose if the elites of Nation B came to depend on elites of Nation A to provide food and aid for the people of Nation B. Why would the elites of Nation B clean up their own act when Nation A is providing Nation B with free stuff? And why would the people of Nation B try to replace the existing elites when they get by on handouts from Nation A?

While all of us must be more dogs than wolves within civilization, the ideal should be for the dogs and masters to be of the same identity. English masters for English dogs. That way, even if civilized man cannot be truly free and independent like a wild wolf, he can still be part of a people that are free and independent of rule by other peoples. The problem with the current West is that white folks are not only dogs of civilization a necessary condition for social order but dogs of a foreign master, the Jews. Worse, Jews are not even good masters over the Other. Jews look upon goyim as mere cattle, commodities, or cuck-dogs. The way Jews look upon goyim is far more contemptuous than how British Imperialists looked upon Hindus and Africans. At the very least, the Christian element of Western Civilization reminded whites that non-whites are also precious children of God. In contrast, Jews look upon goyim as barely human. Jews believe a single Jewish life is worth more than a million goy lives. Just Ask the Palestinians! Under Jewish rule, whites dont even have the freedom, pride, and power of We-the-People. Theyve been reduced to We-the-Cucks.

The black African threat to Europe makes things much worse. Blacks are barely domesticated as dogs; they are more like wild dogs, almost like wolves. As such, a sane West will do everything to protect European mild-willed dogs from African wild dogs. But three factors are forestalling this most necessary course of action. (1) Jewish globalist supremacists who control (((Western))) media and academia have elevated Negroes to god-like status. So many whites worship MLK and Mandela more than their own national/racial heroes, even over God and Jesus. And Jewish Power vilified racism as the worst of all sins, and racism is deemed most wicked when harboring negative feelings about blacks. Political Correctness demands that whites must love and honor blacks NO MATTER WHAT blacks do. (2) Even though blacks have thug supremacy over weaker whites and cause havoc in white nations, the fact remains Europe is rich while Africa is poor. Therefore, many Europeans still have this image of themselves as all-powerful and of blacks as helpless/harmless children. Thus, they fail to grasp the threat posed by black thugs on Western Civilization. (3) Even though civilization did wonders for non-black mankind, it also turned robust human-wolves into less impressive human-dogs. Though civilization can be maintained only by human-dogs, there is still the wolfish element in human-dogs that hankers for wolf-like glory and excitement. Because blacks are more impressive in sports, dancing, hollering, and fist-shaking, many white dogs are in state of awe of the wild black dawg that seems so badass.

The result is that the Current West not only favors mild-willed white dogs(those who go-along to get-along) over the strong-willed white dogs(those with the most survival instincts and fight/flight reflexes, problematic in peace time but essential in times of crisis) but also favors wild black dogs over strong-willed white dogs. This fatal alliance of mild-willed white dogs(and weak-willed white dogs) with wild black dogs against strong-willed white dogs will be the lethal formula that will bring down the West. In times of crisis, the strong-willed dogs must come to the fore to defend the order. In such times, the mild-willed dogs must look to the strong-willed dogs. (However, beware of the ultra-strong-willed dogs like Adolf Hitler. While Hitlers strong-will led Germany in its recovery of lost lands and resurgence in pride, he wasnt content with German affairs and embarked on wolf-attacks on OTHER nations to create a Greater Germanic Empire. This is why strong-will must be limited by Universal Nationalism respect other nations as you expect them to respect your nation and humanism that reminds people of their all-too-fragile humanity. Fascism elevated man to mythic hero while communism reduced man to a unit of History. In World War II, the German ubermensch rediscovered their humanity in defeat and humiliation. And the story of communism is the danger of sacrificing human lives as so many units in the service of History.)


Supremacism – Wikipedia

Ideology

Supremacism is the belief that a certain group of people is superior to all others.[1] The supposedly superior group can be defined by age, gender, race, ethnicity, religion, sexual orientation, language, social class, ideology, nation, or culture, or by membership in any other part of a particular population.

Some feminist theorists[2] have argued that in patriarchy, a standard of male supremacism is enforced through a variety of cultural, political, religious, sexual, and interpersonal strategies.[2][3] Since the 19th century there have been a number of feminist movements opposed to male supremacism, usually aimed at achieving equal legal rights and protections for women in all cultural, political and interpersonal relations.[4][5][6]

Centuries of European colonialism in the Americas, Africa, Australia, Oceania, and Asia were justified by white supremacist attitudes.[7] White European Americans who participated in the slave industry tried to justify their economic exploitation of black people by creating a "scientific" theory of white superiority and black inferiority.[8] Thomas Jefferson, pioneer of scientific racism and enslaver of over 600 black people (regarded as property under the Articles of Confederation),[9] wrote that blacks were "inferior to the whites in the endowments of body and mind."[10] A justification for the conquest and subjugation of Native Americans emanated from their dehumanized perception as "merciless Indian savages", as described in the United States Declaration of Independence.[11][12]

During the 19th century, "The White Man's Burden", the phrase which refers to the thought that whites have the obligation to make the societies of the other peoples more 'civilized', was widely used to justify imperialist policies as a noble enterprise.[13][14] Thomas Carlyle, known for his historical account of the French Revolution, The French Revolution: A History, argued that European supremacist policies were justified on the grounds that they provided the greatest benefit to "inferior" native peoples.[15] However, even at the time of its publication in 1849, Carlyle's main work on the subject, the Occasional Discourse on the Negro Question, was poorly received by his contemporaries.[16]

Before the outbreak of the American Civil War, the Confederate States of America was founded with a constitution that contained clauses restricting the government's ability to limit or interfere with the institution of "negro" slavery.[17] In the Cornerstone Speech, Confederate vice president Alexander Stephens declared that one of the Confederacy's foundational tenets was white supremacy over black slaves.[18] Following the war, a secret society, the Ku Klux Klan, was formed in the South; its purpose was to maintain white, Protestant supremacy after the Reconstruction period, a goal it pursued through violence and intimidation.[19]

According to William Nichols, religious antisemitism can be distinguished from modern antisemitism which is based on racial or ethnic grounds. "The dividing line was the possibility of effective conversion ... a Jew ceased to be a Jew upon baptism." However, with racial antisemitism, "Now the assimilated Jew was still a Jew, even after baptism ... . From the Enlightenment onward, it is no longer possible to draw clear lines of distinction between religious and racial forms of hostility towards Jews... Once Jews have been emancipated and secular thinking makes its appearance, without leaving behind the old Christian hostility towards Jews, the new term antisemitism becomes almost unavoidable, even before explicitly racist doctrines appear."[20]

One of the first typologies used to classify various human races was invented by Georges Vacher de Lapouge (1854–1936), a theoretician of eugenics, who published L'Aryen et son rôle social ("The Aryan and his social role") in 1899. In his book, he divides humanity into various, hierarchical races, starting with the highest, the "Aryan white race, dolichocephalic", and ending with the lowest, the "brachycephalic", "mediocre and inert" race best represented by Southern European, Catholic peasants.[21] Between these, Vacher de Lapouge identified the "Homo europaeus" (Teutonic, Protestant, etc.), the "Homo alpinus" (Auvergnat, Turkish, etc.), and finally the "Homo mediterraneus" (Neapolitan, Andalus, etc.). Jews were brachycephalic just like the Aryans, according to Lapouge, but he considered them dangerous for this exact reason: they were the only group, he thought, which was threatening to displace the Aryan aristocracy.[22] Vacher de Lapouge became one of the leading inspirations of Nazi antisemitism and Nazi racist ideology.[23]

The Anti-Defamation League[24] (ADL) and Southern Poverty Law Center[25] condemn writings about "Jewish Supremacism" by Holocaust-denier, former Grand Wizard of the KKK, and conspiracy theorist David Duke as antisemitic, in particular his book Jewish Supremacism: My Awakening to the Jewish Question.[26] Kevin B. MacDonald, known for his theory of Judaism as a "group evolutionary strategy", has also been accused of antisemitism and white supremacism in his writings on the subject by the ADL[27] and his own university psychology department.[28]

Cornel West, an African-American philosopher, writes that black supremacist religious views arose in America as a part of black Muslim theology in response to white supremacism.[29]

In Africa, black Southern Sudanese allege that they are being subjected to a racist form of Arab supremacy, which they equate with the historic white supremacism of South African apartheid.[30] The alleged genocide and ethnic cleansing in the ongoing War in Darfur has been described as an example of Arab racism.[31] For example, in their analysis of the sources of the conflict, Julie Flint and Alex de Waal say that Colonel Gaddafi, the leader of Libya, sponsored "Arab supremacism" across the Sahara during the 1970s. Gaddafi supported the "Islamic Legion" and the Sudanese opposition "National Front, including the Muslim Brothers and the Ansar, the Umma Party's military wing." Gaddafi tried to use such forces to annex Chad from 1979–81. Gaddafi supported the Sudanese government's war in the South during the early 1980s, and in return, he was allowed to use the Darfur region as a "back door to Chad". As a result, the first signs of an "Arab racist political platform" appeared in Darfur in the early 1980s.[32]

In Asia, ancient Indians considered all foreigners barbarians. The Muslim scholar Al-Biruni wrote that the Indians called foreigners impure.[33] A few centuries later, Dubois observed that "Hindus look upon Europeans as barbarians totally ignorant of all principles of honour and good breeding... In the eyes of a Hindu, a Pariah (outcaste) and a European are on the same level."[33] The Chinese considered Europeans repulsive, ghost-like creatures, even devils, and Chinese writers also referred to foreigners as barbarians.[34]

From 1933 to 1945, Nazi Germany, under the rule of Adolf Hitler, promoted the idea of a superior, Aryan Herrenvolk, or master race. The state's propaganda advocated the belief that Germanic peoples, whom they called "Aryans", were a master race whose members were superior to the Jews, Slavs, and Romani people, so-called "gypsies". Arthur de Gobineau, a French racial theorist and aristocrat, blamed the fall of the ancien régime in France on racial intermixing, which he believed had destroyed the purity of the Nordic race. Gobineau's theories, which attracted a large and strong following in Germany, emphasized the existence of an irreconcilable polarity between Aryan and Jewish cultures.[35]

Academics Carol Lansing and Edward D. English argue that Christian supremacism was a motivation for the Crusades in the Holy Land, as well as crusades against Muslims and pagans throughout Europe.[36] The blood libel is a widespread European conspiracy theory which led to centuries of pogroms and massacres of European Jewish minorities because it alleged that Jews required the pure blood of a Christian child in order to make matzah for Passover; Thomas of Cantimpré writes of the blood curse which the Jews put upon themselves and all of their generations at the court of Pontius Pilate where Jesus was handed a death sentence: "A very learned Jew, who in our day has been converted to the (Christian) faith, informs us that one enjoying the reputation of a prophet among them, toward the close of his life, made the following prediction: 'Be assured that relief from this secret ailment, to which you are exposed, can only be obtained through Christian blood ("solo sanguine Christiano")."[37] The Atlantic slave trade has also been partially attributed to Christian supremacism.[38] The Ku Klux Klan has been described as a white supremacist Christian organization, as are many other white supremacist groups, such as the Posse Comitatus and the Christian Identity and Positive Christianity movements.[39][40]

Academics Khaled Abou El Fadl, Ian Lague, and Joshua Cone note that, while the Quran and other Islamic scriptures express tolerant beliefs, there have also been numerous instances of Muslim or Islamic supremacism.[41] Examples of how supremacists have interpreted Islam include the Muslim participation in the African slave trade, the early-20th-century pan-Islamism promoted by Abdul Hamid II,[42] the jizya and rules of marriage in Muslim countries being imposed on non-Muslims,[43] and the majority Muslim interpretations of the rules of pluralism in Malaysia. According to scholar Bernard Lewis, classical Islamic jurisprudence imposes an open-ended duty on Muslims to expand Muslim rule and Islamic law to all non-Muslims throughout the world.[44]

North Africa has witnessed numerous incidents of massacres and ethnic cleansing of Jews and Christians,[45] especially in Morocco, Libya, and Algeria, where eventually Jews were forced to live in ghettos.[46] Decrees ordering the destruction of synagogues were enacted during the Middle Ages in Egypt, Syria, Iraq, and Yemen.[47] At certain times in Yemen, Morocco, and Baghdad, Jews were forced to convert to Islam or face the Islamic death penalty.[48] While there were antisemitic incidents before the 20th century, antisemitism increased dramatically as a result of the Arab–Israeli conflict. After the 1948 Arab–Israeli War, the Palestinian exodus, the creation of the State of Israel and Israeli victories during the wars of 1956 and 1967 were a severe humiliation to Israel's opponents, primarily Egypt, Syria, and Iraq.[49] However, by the mid-1970s the vast majority of Jews had left Muslim-majority countries, moving primarily to Israel, France, and the United States.[50] The reasons for the Jewish exodus are varied and disputed.[50]

Ilan Pappé, an expatriate Israeli historian, writes that the First Aliyah to Israel "established a society based on Jewish supremacy".[51] Joseph Massad, a professor of Arab studies, holds that "Jewish supremacism" has always been a "dominating principle" in religious and secular Zionism.[52][53] Zionism was established with the political goal of creating a sovereign Jewish state where Jews could be the majority, rather than the minority which they were in all nations of the world at that time. Theodor Herzl, the ideological father of Zionism, considered antisemitism to be an eternal feature of all societies in which Jews lived as minorities, and as a result, he believed that only a separation could allow Jews to escape eternal persecution. "Let them give us sovereignty over a piece of the Earth's surface, just sufficient for the needs of our people, then we will do the rest!"[54]

Since the 1990s,[55][56] Orthodox Jewish rabbis from Israel, most notably those affiliated to Chabad-Lubavitch and religious Zionist organizations,[55][56][57] including The Temple Institute,[55][56][57] have set up a modern Noahide movement to proselytize among non-Jews (usually referred to as "Gentiles" or goyim).[55][56][57] These Noahide organizations, led by religious Zionist and Orthodox rabbis, are aimed at non-Jews in order to proselytize among them and commit them to follow the Noahide laws.[55][56][57] However, these religious Zionist and Orthodox rabbis that guide the modern Noahide movement, who are often affiliated with the Third Temple movement,[55][56][57] expound a racist and supremacist ideology which consists in the belief that the Jewish people are God's chosen nation and racially superior to non-Jews,[55][56][57] and mentor Noahides because they believe that the Messianic era will begin with the rebuilding of the Third Temple on the Temple Mount in Jerusalem to re-institute the Jewish priesthood along with the practice of ritual sacrifices, and the establishment of a Jewish theocracy in Israel, supported by communities of Noahides.[55][56][57] David Novak, professor of Jewish theology and ethics at the University of Toronto, has denounced the modern Noahide movement by stating that "If Jews are telling Gentiles what to do, it's a form of imperialism".[58]

In the aftermath of the 2022 Israeli legislative election, the winning right-wing coalition included an alliance known as the Religious Zionist Party, a grouping of the Religious Zionist, Otzma Yehudit, and Noam parties.[59] Within the context of the 2019–2022 Israeli political crisis, this was the fifth legislative election in nearly four years, as no party since 2019 had been able to form a stable coalition.[60][61] Jewish-American columnist David E. Rosenberg has stated that the Religious Zionist Party's "platform includes things like annexation of West Bank settlements, expulsion of asylum-seekers, and political control of the judicial system".[59] He further described the Religious Zionist Party as a political party "driven by Jewish supremacy and anti-Arab racism".[59]
