At last week’s SXSW conference, prominent transhumanist Eliezer Yudkowsky said that unless the development of artificial general intelligence is halted immediately worldwide, humanity may be destroyed.
“We must stop everything,” Yudkowsky said during a panel titled “How to Make AGI (Artificial General Intelligence) Not Kill Everyone.”
“We are not ready,” he continued. “We do not have the technological capability to design a superintelligent AI that is polite, obedient and aligned with human intentions – and we are nowhere close to achieving that.”
Yudkowsky, founder of the Machine Intelligence Research Institute, has made similar comments in recent years, repeatedly warning that humanity must cease all work on AGI or face extinction.
In a 2023 article in Time magazine, Yudkowsky said that no current AGI project had a feasible plan to align AGI with the interests of humanity. […]
— Read More: allisrael.com