AI, Law, and Responsibility: Understanding OpenAI’s Legal Advice Ban

OpenAI has updated its usage policies, adding a new restriction effective October 29, 2025. What does that really mean for individuals and businesses?
06.11.2025

It has already become second nature: whenever something happens or we need quick information, we turn to ChatGPT and ask about it. However, as of October 29, 2025, OpenAI has updated its usage policies, adding a new restriction:

    “You cannot use our services for the provision of tailored advice that requires a license, such as legal or medical advice, without appropriate involvement by a licensed professional.” 

What does that really mean for individuals and businesses? 

OpenAI Has Drawn a Line of Responsibility for Giving Legal Advice 

This update marks a major step in OpenAI’s approach to responsible AI use. In practice, OpenAI is drawing a clear line between AI-powered information and professional advice. 

While ChatGPT can still help you understand legal concepts or summarise regulations, it does not take responsibility for providing personalised legal recommendations that could influence real-world decisions or carry legal consequences. 

Why This Happened 

OpenAI’s decision is not a sudden shift but rather a necessary response to the growing use — and potential misuse — of AI in sensitive, regulated fields. Several key factors explain why this change became essential. 

Legal liability 
AI models can unintentionally give incorrect or incomplete legal advice, potentially leading to financial or reputational harm. OpenAI does not want to be drawn into legal disputes or user claims that stem from following AI-generated advice, so drawing a clear boundary between providing general information and giving professional advice was a very practical decision on its part.

No one behind the answer 
ChatGPT is built on large language models trained on enormous datasets, and there is no human oversight behind each individual response. So when it “hallucinates” (i.e., invents information or misinterprets facts), there is no accountable professional who can verify or correct the answer before it reaches the user.

Inconsistent responses 
You can ask the AI the same question in two different chats and get completely different answers, one of which could be correct while the other is misleading or even legally risky. This inconsistency makes it impossible to rely on AI for advice that carries legal consequences.

Can ChatGPT Still Help You with Legal Questions? 

In reality, you can still ask ChatGPT about legal topics: to explain common legal terms, analyse the structure of a contract, or summarise a new regulation in plain language. ChatGPT can explain, outline, or clarify, but it will not tell you what to do in your specific situation.

For example, imagine you’re preparing to rent an office space. You might ask: 

“What are common terms in a lease agreement for commercial property?” 

ChatGPT will provide a clear answer, outlining clauses such as the rent amount, deposit, termination, and maintenance responsibilities. However, if you go further and ask it to review your specific office lease and advise whether to sign it, it will warn you that it is not a lawyer and cannot give you legal advice or tell you whether you should sign the contract.

Let’s go a bit deeper and look at the kinds of legal situations where ChatGPT can technically provide an answer — but does not take responsibility for the accuracy or consequences of that answer, making it clear that consulting a lawyer is still the safer choice. 

Personalised Legal Strategy or Recommendations 

ChatGPT can formulate plausible-sounding advice, but it cannot stand behind any specific recommendation in a legal dispute or business decision. It does not take responsibility for telling you whether to sue, sign, or terminate — nor for the potential consequences of doing so. 

It might provide general context on what such decisions typically involve, but choosing the right course of action in your situation requires legal judgment, evidence review, and professional accountability — things only a licensed lawyer can provide. 

Interpretation of Law for Your Specific Situation 

ChatGPT can explain what the law says, but it does not take responsibility for interpreting how the law applies to you. 

For instance, you could ask whether a non-disclosure agreement is enforceable in your canton, whether a customer is entitled to a refund, or whether using a specific image would violate copyright. ChatGPT can explain general principles, but its response is not a legal opinion and carries no liability if it turns out to be wrong. 

Drafting or Reviewing Binding Legal Documents 

ChatGPT can help you draft or format a document and even identify common clauses in a contract. What it will not do is take responsibility for whether that document is valid, enforceable, or suited to your particular business and jurisdiction. 

It can produce a solid draft, but that draft should always be reviewed by a qualified professional who can verify compliance with applicable law and ensure your interests are protected. 

Tax, Financial, or Regulatory Advice 

ChatGPT can outline rules, explain general compliance frameworks, or describe regulatory standards such as the Swiss Data Protection Act (nFADP) or financial regulations under FINMA. However, it does not take responsibility for confirming whether your business actually complies with those requirements. 

Taxation, data protection, and financial supervision in Switzerland often depend on specific business models, cantonal rules, and industry classifications — details that require professional analysis. For that reason, while ChatGPT can help you understand the principles, only a qualified advisor can confirm compliance in practice. 

AI and Accountability: The Next Step in Digital Maturity 

If you go to a lawyer and they give you legal advice that turns out to be wrong or causes financial loss, you can hold that professional accountable. Lawyers operate under licensing, professional ethics, and legal liability — their advice comes with responsibility. 

What about ChatGPT? Until now, this line was blurred. Many people used AI-generated responses as if they came from a qualified expert. However, with the new update, OpenAI has made that boundary clear: ChatGPT is an informational tool, not a legal advisor. It can support your understanding, but it cannot assume responsibility for outcomes that follow from its answers. 

That’s the real turning point: as generic legal AI such as ChatGPT reaches its limits, the future belongs to specialised, compliant, and accountable AI solutions — one of which is AI Lawyer Amy. 

Unlike general-purpose models that don’t take responsibility for the accuracy or legal validity of their output, Amy is built by Swiss lawyers specifically for the Swiss legal environment. Every legal document created with her is backed by a legal guarantee of up to CHF 25,000, meaning that we at AdminTech take full responsibility for the accuracy, compliance, and enforceability of what Amy produces. 

During an interactive consultation, Amy analyses your situation, explains legal terms you might find unclear, and drafts your contract just like a professional legal advisor — with the confidence of human oversight and the speed of AI. 

Discover how AI Lawyer Amy can automate your business’s legal document creation and ensure every contract is accurate, compliant, and legally enforceable.