Ilya Sutskever Launches Safe Superintelligence Inc. to Build Safe and Powerful AI Systems

Palo Alto, California, United States of America
  • Concerns about the safety and regulation of AI technology have been growing in the tech industry
  • Ilya Sutskever has launched Safe Superintelligence Inc. (SSI)
  • Offices will be located in Palo Alto, California, and Tel Aviv, Israel
  • SSI aims to build safe and powerful AI systems
  • SSI will be co-founded by Daniel Gross and Daniel Levy
  • Sutskever was involved in an attempt to oust OpenAI CEO Sam Altman last year

The tech world is abuzz with the latest developments in artificial intelligence (AI) as Ilya Sutskever, co-founder and former chief scientist of OpenAI, announced the launch of his new company, Safe Superintelligence Inc. (SSI). The venture aims to build safe and powerful AI systems and could become a rival to OpenAI.

Sutskever's departure from OpenAI was not amicable; he was involved in an attempt to oust CEO Sam Altman last year. Concerns about the safety and regulation of AI technology have been growing in the tech industry, making SSI's focus on safe superintelligence a timely and important endeavor.

SSI will be co-founded by Daniel Gross, who oversaw Apple's AI and search efforts, and Daniel Levy, formerly of OpenAI. The company plans to have offices in Palo Alto, California, and Tel Aviv, Israel.

Sutskever's new venture comes after he left OpenAI last month to work on a project that is personally meaningful to him. With the increasing importance of AI in our daily lives and its potential impact on society as a whole, SSI's mission is crucial for ensuring the safe development and implementation of advanced AI systems.

The tech industry has seen significant advancements in generative AI, with OpenAI's ChatGPT being a notable example. However, these developments also raise concerns about the potential risks associated with superintelligent AI systems. SSI's focus on safety will be essential for addressing these concerns and ensuring that the benefits of advanced AI are realized without causing harm.

The launch of SSI marks an exciting new chapter in the world of artificial intelligence, as we continue to explore its potential and navigate the challenges it presents. Stay tuned for more updates on this developing story.



Confidence

91%

Doubts
  • Was there any official statement from OpenAI regarding Ilya Sutskever's departure?

Sources

97%

  • Unique Points
    • Ilya Sutskever, co-founder of OpenAI and its former chief scientist, has started a new company called Safe Superintelligence.
    • Safe Superintelligence aims to build A.I. technologies that are smarter than humans but not dangerous.
    • Last year, Ilya Sutskever helped create a team inside OpenAI to ensure that A.I. technologies would not do harm.
  • Accuracy
    • Ilya Sutskever, co-founder of OpenAI, announced the launch of Safe Superintelligence Inc.
    • Safe Superintelligence Inc. is focused on building safe and powerful artificial intelligence as a rival to OpenAI.
    • SSI's goal is to create a safe and powerful AI system.
    • SSI will have offices in Palo Alto, California, and Tel Aviv, Israel.
  • Deception (100%)
    None Found At Time Of Publication
  • Fallacies (100%)
    None Found At Time Of Publication
  • Bias (100%)
    None Found At Time Of Publication
  • Site Conflicts Of Interest (100%)
    None Found At Time Of Publication
  • Author Conflicts Of Interest (100%)
    None Found At Time Of Publication

100%

  • Unique Points
    • Ilya Sutskever, co-founder of OpenAI, announced the launch of Safe Superintelligence Inc.
    • Safe Superintelligence Inc. is focused on building safe and powerful artificial intelligence as a rival to OpenAI.
    • Sutskever was involved in an attempt to oust OpenAI’s CEO Sam Altman last year
    • Concerns about the safety and regulation of AI technology are growing in the tech world
    • Sutskever left OpenAI last month to work on a project that is personally meaningful to him
  • Accuracy
    No Contradictions at Time Of Publication
  • Deception (100%)
    None Found At Time Of Publication
  • Fallacies (100%)
    None Found At Time Of Publication
  • Bias (100%)
    None Found At Time Of Publication
  • Site Conflicts Of Interest (100%)
    None Found At Time Of Publication
  • Author Conflicts Of Interest (100%)
    None Found At Time Of Publication

98%

  • Unique Points
    • Ilya Sutskever, co-founder and former chief scientist of OpenAI, is starting a new AI company named Safe Superintelligence Inc. (SSI).
    • SSI's goal is to create a safe and powerful AI system.
    • Safe Superintelligence Inc. is co-founded by Daniel Gross, a former AI lead at Apple, and Daniel Levy, who previously worked as a member of technical staff at OpenAI.
    • SSI's first product will be safe superintelligence and the company will not do anything else until then.
  • Accuracy
    • The company approaches safety and capabilities in tandem to quickly advance its AI system while prioritizing safety.
    • Last year, Ilya Sutskever led the push to oust OpenAI CEO Sam Altman and left OpenAI in May.
  • Deception (100%)
    None Found At Time Of Publication
  • Fallacies (100%)
    None Found At Time Of Publication
  • Bias (100%)
    None Found At Time Of Publication
  • Site Conflicts Of Interest (100%)
    None Found At Time Of Publication
  • Author Conflicts Of Interest (100%)
    None Found At Time Of Publication

99%

  • Unique Points
    • Ilya Sutskever, co-founder and chief scientist of OpenAI, announced the launch of his new AI startup named Safe Superintelligence (SSI).
    • Sutskever left OpenAI in May 2024 and will focus solely on safe superintelligence at SSI.
    • The company will have offices in Palo Alto, California, and Tel Aviv, Israel.
    • Daniel Gross and Daniel Levy are joining SSI from Apple and OpenAI respectively.
    • Sutskever was one of the board members behind the attempt to oust Sam Altman as CEO of OpenAI in November 2023.
  • Accuracy
    • Ilya Sutskever, co-founder and former chief scientist of OpenAI, is starting a new AI company named Safe Superintelligence Inc. (SSI).
    • Safe Superintelligence Inc. is focused on building safe and powerful artificial intelligence as a rival to OpenAI.
    • SSI has a singular focus, allowing it to avoid distraction by management overhead or product cycles.
    • SSI was founded by Sutskever along with former Y Combinator partner Daniel Gross and ex-OpenAI engineer Daniel Levy.
    • Both Sutskever and Jan Leike, who co-led OpenAI’s Superalignment team, left the company in May 2024 after a disagreement over the approach to AI safety.
    • SSI is focused on developing safe superintelligence as its primary and only objective, aiming to create an AI that is both powerful and secure.
  • Deception (100%)
    None Found At Time Of Publication
  • Fallacies (100%)
    None Found At Time Of Publication
  • Bias (100%)
    None Found At Time Of Publication
  • Site Conflicts Of Interest (100%)
    None Found At Time Of Publication
  • Author Conflicts Of Interest (100%)
    None Found At Time Of Publication

99%

  • Unique Points
    • Ilya Sutskever, a co-founder of OpenAI and its former chief scientist, has launched a new AI company called Safe Superintelligence Inc. (SSI).
    • SSI was founded by Sutskever along with former Y Combinator partner Daniel Gross and ex-OpenAI engineer Daniel Levy.
    • Both Sutskever and Jan Leike, who co-led OpenAI’s Superalignment team, left the company in May 2024 after a disagreement over the approach to AI safety.
    • SSI is focused on developing safe superintelligence as its primary and only objective, aiming to create an AI that is both powerful and secure.
    • Unlike OpenAI, which started as a non-profit organization and later restructured due to financial needs, SSI is being established as a for-profit entity from the beginning.
  • Accuracy
    No Contradictions at Time Of Publication
  • Deception (100%)
    None Found At Time Of Publication
  • Fallacies (95%)
    The author provides a clear and factual account of Ilya Sutskever's departure from OpenAI and the formation of his new company, Safe Superintelligence Inc. (SSI). There are no explicit fallacies found in the article. However, there is an appeal to authority when the author quotes Daniel Gross stating that 'raising capital is not going to be one of them.' This does not significantly impact the score as it is a minor infraction.
    • "raising capital is not going to be one of them." (Daniel Gross, Bloomberg)
  • Bias (100%)
    None Found At Time Of Publication
  • Site Conflicts Of Interest (100%)
    None Found At Time Of Publication
  • Author Conflicts Of Interest (100%)
    None Found At Time Of Publication