EU AI Regulation (Tim Martin featured in Forbes)


Tim Martin, Executive VP of Product & Development at Yseop, was recently featured in Forbes discussing AI regulation and the EU AI Act. For context, the EU AI Act is the first law on AI from a major regulator, and lawmakers are aiming to reach a final agreement by the end of this year. To mitigate the potential harmful effects of AI, its restrictions focus on protecting health, safety, and fundamental rights, and on prioritizing human-centric, trustworthy AI.

The Discussion of AI Regulation

Discussion surrounding AI regulation has been building for several years, but the emergence of tools such as ChatGPT has further intensified concerns. The act proposes that the provider of a foundation model must take reasonable steps to reduce risks of bias; if all reasonable mitigation steps are taken, the provider would not be liable even if harm still occurred. Accordingly, Tim believes that companies with proprietary foundation models may need to reveal more than they have about their training data. “The regulation is saying, ‘you need to be more transparent about what you’re doing… if you don’t tell us anything, we have no way to evaluate the impact on citizens.’”

Further details of the AI Act are still under discussion. More news will continue to circulate on this topic, and we will be watching for the final impact.

Yseop’s Generative AI 

Yseop is a generative AI company working with companies in highly regulated industries. Recently, Yseop launched Yseop Copilot, a digital colleague for medical writers that leverages pre-trained generative AI models (LLMs) specifically suited to the BioPharma industry. With Yseop Copilot, medical writers across hundreds of clinical trials have cut their writing times significantly. Yseop’s products are designed as copilots that increase human efficiency and make work more enjoyable, productive, and fair.

To learn more about Yseop’s generative AI technology and how it can accelerate your business, please contact us.
