Artificial Intelligence Tools for Law Firms – Solutions and Applications
Artificial Intelligence (AI) is rapidly spreading in the legal profession and is increasingly becoming part of everyday work. Recent surveys show that 62% of lawyers use some form of generative AI tool at least weekly, and 51% rely on it daily. Below, we review the most suitable AI solutions currently available for law firms and legal advisors – presented in a professional, objective manner, highlighting which tool is best suited for which legal task. We cover case law and statutory research, contract drafting and document preparation, contract due diligence, office administration automation, and AI chatbots used in client communication. In each category, we mention specific products and services – such as Harvey AI, Casetext CoCounsel, Spellbook, Luminance, Kira, LawGeex, Ironclad, DoNotPay, Lega, Henchman, Microsoft 365 Copilot – along with brief descriptions, availability, and practical implementation tips.
Legal Research and Precedent Search with AI
Legal research – searching for judgments, statutes, and precedents – is traditionally a time-consuming task. New generative AI tools also provide support in this area for lawyers: language models can scan vast amounts of text in moments and answer our natural language questions based on legal sources.
Casetext CoCounsel – now a Thomson Reuters product – is a comprehensive AI assistant that combines statutory and case law research, document drafting, and document analysis on one platform. CoCounsel is built on the OpenAI GPT-4 model but is further trained with legal data (e.g., case decisions, explanations) and is thus capable of generating answers to complex legal questions. For example, with CoCounsel, we can search for precedents related to a specific case, have a summary prepared from a judicial decision, or even generate a legal memorandum that references relevant case law. Thomson Reuters has integrated CoCounsel with Westlaw and Practical Law content, so the system works with “official” databases, and the generated responses often include source references (e.g., court decisions, statutory provisions) for reliability. However, it is important to highlight that CoCounsel is a professional tool, primarily working with English legal materials, and its cost is significant: according to an independent review, the starting fee is about 225 USD per user per month, so it is primarily worth the investment for larger firms or international practices.
Harvey AI is another promising AI platform specifically tailored for legal work. Harvey is also built on OpenAI technology (ChatGPT/GPT-4), which has been further trained on a vast amount of legal text. The system became known within the PwC global network after they entered into an exclusive collaboration with the developers. The uniqueness of Harvey is that during implementation, it is customized with the knowledge of the specific law firm: it is trained on the firm's previous cases, document templates, and internal knowledge base, thus “onboarding” it similarly to a new lawyer colleague. As a result, it can work in many languages and legal systems, as the model has been trained on multilingual legal corpora (so it may be usable to some extent in international cases, even with Hungarian documents). Harvey is a general-purpose legal AI assistant that assists in contract drafting, contract analysis, legal research, fact-finding, and compliance checks, making it very versatile. According to PwC Hungary's press release, the Harvey platform automates and supports various areas of legal work with the combination of natural language processing, machine learning, and data analysis, helping to understand connections and develop recommendations by processing large amounts of data – but the results are, of course, always reviewed by experts. Harvey is currently mainly available to large international firms (e.g., Big4 companies), but the developers announced in 2025 that they are working on more affordable, commercial versions for smaller firms as well.
Of course, other legal research AI solutions are also taking shape: LexisNexis is integrating AI into its Lexis+ platform (e.g., under the name Lexis+ AI), and the Spanish–international vLex system's Vincent AI assistant can also search multilingual legal databases. The common feature of these solutions is that they shorten research time and summarize the essentials – but the final result is always filtered by the lawyer. In practice, these tools are useful in that the lawyer quickly gets a starting point (e.g., a relevant case list or an analytical summary), which they then verify and supplement with their expertise. Since most Hungarian legal materials and precedents are not yet part of the large language models' knowledge, Hungarian lawyers can mainly use these AI tools for Anglo-Saxon legal research or exploring EU legal materials. However, it is expected that localized content (even Hungarian court decisions) will be included over time – or domestic providers will offer their own AI-based search engines for this purpose.
Contract Drafting and Document Generation with AI
AI also provides significant assistance in writing and editing contracts and legal documents, especially generative text models. These can create complete contract drafts based on templates or suggest wording for specific clauses. However, it is important that these are always reviewed professionally – AI does not replace careful legal work but can speed it up.
Spellbook – a product of Rally – was the first generative AI “copilot” specifically for lawyers, and today it is one of the most comprehensive AI toolkits for transactional lawyers. Spellbook works as a Microsoft Word add-in, meaning it provides suggestions and checks directly in the Word text. With its help, lawyers can edit and review contracts up to ten times faster. What does Spellbook specifically offer? On the one hand, it generates full text drafts from contract templates or a few parameters (“Draft” function), and on the other hand, it indicates potentially risky clauses and suggests corrections (“Review” mode) – all in Word, in the form of red edits. Additionally, it can be asked questions about the text (“Ask”), or it can compare the contract with industry standards (“Benchmarks”). For example, if we are writing an NDA, Spellbook can alert us if a usual confidentiality definition is missing or if a non-compete clause is too general – and it can offer alternative text sections. The system, with its own Clause Library function, can also import clauses saved from previous contracts. It is important that Spellbook places great emphasis on data protection: according to their commitment, they operate on a Zero Data Retention principle, meaning they do not retain uploaded documents and do not use them for further model training. Spellbook thus tries to eliminate the risk of confidential legal data leaking into the cloud. The tool is currently mainly strong in English (and trained on common law contracts), but since it can technically read text written in any language, it can be applied in other languages with appropriate training. Its pricing is flexible: it offers a 7-day free trial, after which it charges a subscription fee depending on team size (based on individual agreements). 
Market information suggests that full-featured Spellbook access costs about 150–180 USD per month per lawyer – a cost that larger law firms can more easily cover, while smaller practices typically purchase a few licenses for the most common contractual tasks.
Henchman is a similarly practical tool for contract drafting, especially if the firm already has a significant precedent collection. Henchman is also a Word add-in that automatically connects to the legal team's document repository and mines previously written clauses from there. Essentially, it is a smart “template retriever”: if the lawyer wants to insert a certain type of clause, Henchman shows how it appeared in previous contracts in moments. We can search for specific text fragments, browse for inspiration among similar clauses, or compare variations across multiple documents. This saves the lawyer time and increases consistency, as they can reuse proven, verified texts. Henchman's great advantage is that it is system- and language-independent – it can extract and handle texts from contracts in any language and format, as it works on our own database. In 2023, LexisNexis entered into a strategic partnership with Henchman's developers and made the function available as part of the Lexis Create+ document editor. This means, for example, that Lexis+ users get Henchman's clause mining capability integrated into Word, complemented by legal research in the Lexis database. Henchman (or Lexis Create+) strongly supports the Word-based workflow and provides precedent-based editing and quick research links. It is ideal for firms that already have many of their own contract templates and want to reuse them consistently and quickly when creating new documents.
In a corporate environment where contract creation is closely linked to contract management, Contract Lifecycle Management (CLM) platforms like Ironclad offer built-in AI support. The Ironclad AI Assist function works in both the browser and Word: the system uses templates and questionnaires to generate initial contract drafts, then automatically flags deviations and suggests modifications based on the playbook (legal guideline). Since Ironclad CLM covers the entire process from generation to approval, the AI's suggestions integrate directly into the workflow – for example, the system immediately implements the legal team's defined rules (standard clauses, alternatives) in all new contracts. This is extremely efficient for high-volume, business-like contracting (e.g., standardizing all supplier contracts of a company). Its disadvantage is that Ironclad AI focuses less on sophisticated linguistic reasoning or external legal research references – it primarily serves internal consistency and scalability, so the suggested changes do not always come with explanatory text or legal reasoning. Ironclad targets large corporate clients who are already using Ironclad CLM; for them, the drafting AI is an additional module in the system. Its pricing is based on individual agreements, typically an annual subscription to the platform – quite costly, but many companies choose it for its “end-to-end” capability if they want to digitize every element of the contractual process.
Of course, generative AIs can also assist in contract writing in simpler forms. Many lawyers experiment with general models like OpenAI ChatGPT or Azure/OpenAI-based Copilots to create a first draft of a document. For example, a simple contract can be drafted roughly with ChatGPT, or ideas can be requested for formulating certain parts of a complex submission. These are free or low-cost alternatives, but there is a high risk of inaccuracy and data privacy concerns (see warnings below). The Microsoft 365 Copilot – discussed in a separate chapter – can also, for example, draft a contract or letter in Word based on a given point list. However, in all these cases, thorough legal review is crucial: generative AI may “hallucinate” (cite non-existent laws, mix concepts), and the output must be adjusted to the desired style. Overall, in contract writing, AI can be seen as a junior colleague who gathers templates, suggests text, but the final touches and responsibility remain with the lawyer.
Contract Due Diligence and Review (AI Due Diligence)
Contract due diligence and auditing – such as reviewing hundreds of contracts during a company acquisition or compliance checking a large client's contract portfolio – is typically an area where AI can bring significant efficiency gains. Specialized AI software can quickly identify risky clauses, gaps, and divergent wording in large document sets and even prepare summary reports of findings. Such tools have been used by larger law firms for some time – e.g., machine learning-based clause extractors – but the latest generation combines analysis with generative models, offering more interactive and intelligent features.
Luminance – a platform developed by Cambridge AI experts – is considered a pioneer in due diligence AI. Luminance originally gained fame for its pattern recognition algorithms that found “anomalies” in sales contract packages, showing which documents deviate from usual clauses (e.g., where a customary warranty is missing or where there is an unusually long termination period). Today, Luminance has grown into a comprehensive contract management AI system: it offers contract generation assistance, negotiation support, an intelligent document repository, and a chatbot. Luminance operates language-independently, deployable in multiple languages in global projects, and is used by over 700 organizations worldwide, mainly law firms and corporate legal departments. The system's “Panel of Judges” technology combines many different models to evaluate text from various perspectives and derive the final result from their consensus, ensuring “lawyer-level” accuracy. Practical functions: Luminance works like an AI-based contract checker, instantly analyzing any loaded contract and visually indicating which points deviate from previously accepted texts – it's almost like having a “legal spell checker” in the document. With one click, we can even align a third-party contract with the gold standard, as the system knows our company's preferred text and automatically inserts it instead. If something does not comply with internal policy, the Word sidebar immediately offers pre-approved precedent text or fallback solutions that can be inserted with a button press. Luminance also builds an intelligent contract repository: it extracts key data from all uploaded or processed contracts (recognizing more than 1000 types of legal concepts, e.g., contract duration, included clauses, governing law, etc.). So, we can ask at any time: for example, “do we have any active contracts that stipulate New York law and remain in effect after 2025?”, and the system lists them in moments.
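To make the repository idea concrete: once key data has been extracted from each contract, a query like the New York-law example above reduces to filtering structured metadata. The sketch below is a purely illustrative simplification (the record shape, field names, and sample data are assumptions, not Luminance's actual data model):

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record for extracted contract metadata; real platforms
# extract far more fields (Luminance reportedly recognizes 1000+ concept types).
@dataclass
class ContractMeta:
    title: str
    governing_law: str
    expiry: date
    active: bool

repository = [
    ContractMeta("Supply Agreement A", "New York", date(2026, 3, 1), True),
    ContractMeta("NDA B", "Hungary", date(2024, 12, 31), True),
    ContractMeta("License C", "New York", date(2025, 1, 15), False),
]

# "Active contracts under New York law that remain in effect after 2025"
hits = [c for c in repository
        if c.active and c.governing_law == "New York"
        and c.expiry > date(2025, 12, 31)]

for c in hits:
    print(c.title)  # -> Supply Agreement A
```

The value of the AI layer is in populating such a repository automatically from unstructured contract text; the querying itself is then ordinary structured search.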
Luminance's built-in chatbot, “Ask Lumi”, allows us to ask questions about the documents in natural language and get immediate answers. For example, we can ask: “Summarize this 30-page supplier contract in 5 points” – Lumi creates a concise summary in seconds. The same chatbot can also be instructed to rewrite a specific clause: e.g., “Suggest a liability limitation clause that meets our standard but is an acceptable compromise compared to the partner's proposal” – Luminance generates a modified text that is a middle ground between the two versions. These functions are particularly useful in cross-border transactions, where materials from multiple legal systems and languages often need to be viewed with a unified perspective: Luminance is jurisdiction-sensitive (considers which country's law the contract was made under) and also performs compliance mapping – for example, it immediately indicates GDPR issues if data protection provisions are missing. It should be noted that implementing Luminance is not a trivial task: full utilization requires thorough training and customization, and the software itself is a significant investment (the manufacturer applies a custom pricing model depending on the project size). Additionally, it is currently not natively integrated with Microsoft 365 (runs as a separate application, although it has a Word/Outlook plugin), which may require some getting used to. It is primarily recommended for large international law firms and companies with complex, multilingual contract portfolios, where anomaly detection and deep AI analysis can be leveraged.
Kira Systems – now part of Litera – is among the oldest players in machine learning-based contract analysis. Kira has been used for years in M&A due diligence: its machine learning-trained modules recognize hundreds of types of contract clauses in uploaded documents and extract their content (e.g., lease agreements' rent increase clauses, loan agreements' covenants, etc.). Kira's strength is accurate text extraction in large volumes – it can review thousands of contracts and compile a table of what each contains. This is a huge help, for example, during a company audit: the AI completes the junior lawyers' days-long checking in a few hours, and the lawyer can focus on the extracted data. However, Kira is not specialized in redlining or suggesting text corrections, and it does not have built-in anonymization – it is more of a “raw” analytical tool, requiring lawyers to manually interpret the extracted information. Litera recently announced the Kira + GenAI development, indicating that they are integrating the generative model with Kira's knowledge – presumably meaning that the data recognized by Kira will be used by an AI chatbot for Q&A purposes. Kira is typically a tool for larger law firms and M&A teams, where routine review of many identical contracts is needed. Its pricing is license-based, generally with an annual fee – considered a mid-range AI solution in terms of cost.
LawGeex is another market-leading AI for automating contract review, particularly tailored to corporate legal advisors' needs. LawGeex is known for being one of the first AI startups to offer automatic approval of NDAs and other common contracts, faster and cheaper than lawyers. The platform's essence is that we upload an incoming contract draft (e.g., a non-disclosure agreement from a business partner), and LawGeex returns a version filled with suggested edits (redline) within minutes, aligning with our company's internal rules. So, it not only highlights problematic parts but specifically rewrites them according to the given policy – as if an experienced lawyer had corrected the document with a red pen. LawGeex's AI understands the text's context and the company's preferences, rather than merely matching keywords. For example, if company policy requires every NDA to include a data retention obligation and the received NDA lacks one, it not only notes this but inserts a suggested clause. Similarly, if the clause is present but not properly worded, it rewrites it according to the company's digital playbook. LawGeex acts like the extended arm of our legal team: it pre-screens routine contracts and meticulously marks them up with a red pen before the lawyer even looks at them. According to the company, this solution can achieve up to 80% time savings in contract turnarounds and enables 3 times faster deal-making. LawGeex operates as a cloud-based service, typically with contract-volume-based pricing (e.g., a monthly quota of contract reviews is included in the subscription, with additional fees above that). They do not publicly disclose specific prices, providing custom quotes based on company size and needs. It may also be accessible to smaller legal departments (a quick ROI is often calculated: a Forrester study mentions a 209% ROI in connection with a LawGeex implementation).
During implementation, it is important for the company to define its own rules (playbook), which LawGeex digitally maps – this requires some initial work from the lawyers, but afterward, the system automatically applies them.
Besides the above, there are many other AI solutions for contract review: for example, LegalOn (Japanese origin, strong in risk flagging), ContractPodAI (with Leah AI assistant, playbook-driven redlining), or Juro (which also functions as a full CLM, with built-in AI risk alerts). It is evident that the AI toolkit for contract due diligence is rapidly expanding, but it is always true that the “red flag” points indicated by AI must ultimately be validated by a lawyer. AI does not consider the business context or the negotiation room of opposing parties – this is still up to the human lawyer to decide whether the identified risk is acceptable or needs renegotiation. Nevertheless, experience shows that these tools drastically reduce monotonous checking work (often reporting >50-70% time savings) and make quality control more consistent, as they do not overlook small details that a tired human eye might miss.
Automation of Administration and Internal Office Work
Legal work does not consist solely of purely legal tasks but also involves a lot of administrative burden: calendar management, meeting minutes, deadline tracking, time recording, invoice preparation, internal training, knowledge management, etc. In these areas, generative AI can also be a useful assistant, primarily by summarizing information and initiating tasks on behalf of humans.
The most straightforward example is Microsoft 365 Copilot, which has gradually become available to business users since 2024. Copilot is an AI assistant integrated into Microsoft Office applications (Word, Outlook, Excel, PowerPoint, Teams, etc.), appearing, for example, during Teams meetings or while writing emails in Outlook, offering its assistance. In a law firm, it can typically be deployed for tasks like: automatically creating a memo from a Teams meeting recording, highlighting important decisions and action items; or providing a short summary of a long email thread, so you don't have to read all 20 emails. Copilot can also answer questions based on our calendar, emails, and chats (this is called Business Chat): for example, we can ask, “what is the status of the XY project?”, and the AI gathers relevant information from our emails and documents and provides a summary. It can also assist with time tracking: if a lawyer wants to record their work hours from the previous week, Copilot (based on calendar entries and document edits) can suggest how much time was spent on which project – although this is still an experimental feature, some legal software (e.g., Time by Ping) already offers automated time-tracking with AI. Copilot's text summarization and letter-writing capabilities are also very useful: for example, it can create a short extract from a long court judgment (even in Hungarian, as Copilot works in multiple languages through language models), or draft a response letter to a client's email, politely incorporating the content of a previous discussion. Of course, the lawyer fine-tunes and sends it, but generating the draft saves significant time.
Microsoft 365 Copilot is currently available with a separate subscription in enterprise packages: from early 2024, it is also available to small and medium-sized businesses at a price of 30 USD per user per month (in addition to Enterprise E3/E5 packages). This is not cheap, but considering that an expensive lawyer can save 1-2 hours a day on monotonous administration with AI, it can pay off. It is important to highlight that Copilot operates within corporate data privacy frameworks: Microsoft guarantees that data processing remains within the tenant, not learning from other clients' materials (unlike public ChatGPT). Thus, an internal meeting memo or correspondence of a law firm can be relatively safe to use with Copilot – of course, with proper internal regulation.
Microsoft is not the only one in this area: Google has also introduced Duet AI for Google Workspace (Gmail, Google Docs, etc.) users, with similar features. Additionally, several startups offer specialized office AI assistants. There are also solutions specifically focusing on the legal domain, such as Casetext CoCounsel's timeline creation function for litigation cases (establishing chronology based on documents) or Harvey AI's workflow automation capability (linking repetitive tasks). All these aim to allow lawyers and their assistants to focus on value-creating tasks instead of mechanical chores.
Overall, AI in administrative work plays a secretary-assistant role: it does not make legal decisions but remembers, reminds, prepares, organizes. When implementing it, attention should be paid to colleagues learning to use it effectively (e.g., giving good instructions to Copilot) and checking the output – since even an automatic meeting memo can contain misunderstandings if the audio recording was unclear. With proper control, however, AI can take a huge time pressure off our shoulders and reduce the risk of human errors (e.g., not forgetting deadlines, not overlooking an important email).
Client Communication and AI Chatbots
In legal practice, communication with clients is crucial – and AI is also starting to appear in the background here. While direct legal advice will always remain a trust-based human activity, there are standardizable interactions where an AI chatbot can relieve the office. These include initial client information on the website (FAQ chatbot), automatic scheduling of appointments, or answering simple client questions (e.g., “When will my contract be ready?” status updates).
A widely known example is DoNotPay, which bills itself as the world's first “robot lawyer.” DoNotPay originally started as a simple chatbot that helped appeal parking tickets – the app guided the user through a few questions and then generated an official letter to the authorities. It later expanded to cover many other small legal-procedural matters: e.g., canceling subscriptions, filing compensation claims with airlines for delays, minor consumer complaints, etc. Currently, DoNotPay can handle over 200 types of “cases” automatically through a simple mobile app interface. Its business model is tailored to individual clients: about 36 dollars every two months for unlimited use (~18 USD/month), which pales in comparison to a lawyer's hourly rate. Of course, DoNotPay is not suitable for complex legal cases, but its essence is to provide a cheap, automated solution for simple legal problems that people wouldn't otherwise consult a lawyer for. Here, AI performs document generation: it fills templates with user-provided data and polishes the letters into more natural language with a language model. Interestingly, DoNotPay even planned for AI to argue in a real court hearing through a defendant's earpiece – this was eventually canceled due to warnings from bar associations, but it shows the direction of experimentation (the case highlighted the ethical and regulatory boundaries, see below).
From a law firm's perspective, DoNotPay is more of an attention-grabbing example than a direct competitor – in fact, such solutions can even complement a firm's own offering, e.g., a free automated helpdesk-style chatbot for simple cases on the firm's website, while more serious cases are still handled by a lawyer. Several large international firms are also experimenting with developing their own AI chatbots, which, for example, answer frequently asked questions on the website (pricing, areas of expertise, “how do I file a lawsuit?”, etc.) or help fill out forms on the client portal. These chatbots often build on the GPT-4 model but are fine-tuned with the firm's own materials and operate within narrow limits (to avoid giving unsolicited legal advice).
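The "narrow limits" idea can be illustrated with a minimal guard layer in front of the chatbot. This is a hypothetical sketch only (the FAQ entries and trigger phrases are invented; a real deployment would combine a fine-tuned language model with a much more robust classifier):

```python
# Minimal sketch of a guarded FAQ chatbot: answer from a curated knowledge
# base, and refuse anything that looks like a request for legal advice.
# All entries and trigger phrases below are illustrative assumptions.

FAQ = {
    "pricing": "Our hourly rates depend on the matter; please book a consultation.",
    "areas": "We practice corporate, M&A, and data protection law.",
    "lawsuit": "To start a case, please upload your documents via the client portal.",
}

ADVICE_TRIGGERS = ("should i sue", "is this legal", "what are my chances")

def answer(question: str) -> str:
    q = question.lower()
    # Refuse anything resembling a request for individual legal advice.
    if any(trigger in q for trigger in ADVICE_TRIGGERS):
        return ("I can't give legal advice. A lawyer from our team will "
                "contact you to discuss your case.")
    # Otherwise answer only from the approved knowledge base.
    for topic, reply in FAQ.items():
        if topic in q:
            return reply
    return "Please rephrase your question, or contact our office directly."

print(answer("What are your pricing options?"))
# -> Our hourly rates depend on the matter; please book a consultation.
```

The design point is that the refusal path and the approved answers are both under the firm's control, so the bot never improvises advice even if the underlying model could.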
Another application area is preparing client submissions. For example, consider a larger consumer rights case: many clients need to file similar claims (like in a class action, but individually). For this, an AI solution can create a web interface where the client simply fills in some data, and the AI generates the personalized submission in the background. This has been done, for example, in the USA for parking tickets or flight delay compensations. Here, AI also uses the template+generative module combination: there is a lawyer-approved template, and it adapts it to the unique data in natural language.
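The template-plus-generative pattern described above can be sketched in a few lines. The field names and the letter text below are purely illustrative assumptions; the point is that a lawyer-approved template fixes the required content, and a language model can then (optionally) polish the filled text into more natural prose:

```python
from string import Template

# Lawyer-approved template with placeholders for client-supplied data
# (hypothetical fields for a flight-delay compensation claim).
CLAIM_TEMPLATE = Template(
    "Dear $airline,\n\n"
    "I, $name, was a passenger on flight $flight on $date, which arrived "
    "$delay_hours hours late. Under applicable passenger-rights rules, "
    "I hereby claim compensation of $amount.\n\nSincerely,\n$name"
)

def generate_submission(client_data: dict) -> str:
    """Fill the approved template with client data.

    In production, a language model could then rephrase the result into
    more natural language -- with the approved template guaranteeing that
    every required element remains present.
    """
    return CLAIM_TEMPLATE.substitute(client_data)

letter = generate_submission({
    "airline": "Example Air",
    "name": "Jane Doe",
    "flight": "EX123",
    "date": "2025-05-01",
    "delay_hours": "4",
    "amount": "EUR 400",
})
print(letter)
```

Because the template is fixed and only the data varies, the legal review happens once (on the template), not once per client.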
Regarding the language of client communication, AI's great advantage is that it can speak in any language (modern models work in dozens of languages). So, a Hungarian lawyer can have a chat conversation with a non-English-speaking client with AI's help, where it translates and generates responses in real-time. Naturally, data security is also critical here: such a solution should only be used with appropriate encryption and data management guarantees, otherwise sensitive client information can easily leak.
Tips, Risks, and Ethical Considerations for AI Implementation
Data Protection and Confidentiality: Protecting confidential data is paramount when using AI tools. As a law firm, we cannot upload client data to public AI services (e.g., the public version of ChatGPT), as we do not know who has access to the data in the background, and both GDPR and attorney-client privilege prohibit this. Therefore, it is advisable to choose solutions that guarantee enterprise-level data security: for example, Microsoft 365 Copilot business version, Azure OpenAI for custom developments, or legal-specific tools like Spellbook, which contractually adhere to the zero-data-retention principle. Anonymization is also important; if the chosen AI tool does not anonymize automatically, we should develop an internal procedure for this (e.g., have an intern redact names from the contract before uploading it to the AI). Additionally, review the provider's terms and conditions: many consumer AI services stipulate that they can use the input data for further model training – we cannot allow this with legal data. Instead, choose a legal AI platform that also provides an audit trail (logs who sent what data to the AI and what they received back) and allows certain data to be blacklisted (e.g., the client's name should not appear in the prompt). New solutions are emerging in this area: Lega.ai, for example, is a platform specifically aimed at enabling legal organizations to experiment with generative AI in a secure environment – providing a central administration interface where user access can be managed, checkpoints set for compliance, and usage audited. In such a framework, a law firm can introduce a ChatGPT-based tool while adhering to its data security rules.
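The internal anonymization procedure mentioned above can start very simply. The sketch below is a minimal illustration under stated assumptions (the redaction rules and sample text are invented; a production setup would use a NER model or a curated client-name blacklist rather than a handful of regex patterns):

```python
import re

# Illustrative redaction rules -- simple patterns for common identifiers.
REDACTION_RULES = [
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),  # e-mail addresses
    (re.compile(r"\+?\d[\d \-/]{7,}\d"), "[PHONE]"),          # phone numbers
]

def redact(text: str, client_names: list[str]) -> str:
    """Replace known client names and common identifiers before the text
    is sent to any external AI service."""
    for name in client_names:
        text = re.sub(re.escape(name), "[CLIENT]", text, flags=re.IGNORECASE)
    for pattern, placeholder in REDACTION_RULES:
        text = pattern.sub(placeholder, text)
    return text

sample = "Contact ACME Kft. at info@acme.hu or +36 1 234 5678."
print(redact(sample, ["ACME Kft."]))
# -> Contact [CLIENT] at [EMAIL] or [PHONE].
```

Even such a basic pre-processing step, run before any upload, reduces the chance of client-identifying data reaching an external service; the audit-trail and blacklist features described above then cover what regex alone cannot.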
Accuracy and “Hallucination”: Generative AI models can be prone to inventing plausible but incorrect information if not properly constrained. This is particularly dangerous in the legal field – consider the infamous case where an American lawyer cited non-existent court cases generated by ChatGPT in his submission because he did not think the AI would confidently provide false answers. To avoid such cases, never rely blindly on AI-generated text. “Factual verification” remains the lawyer's responsibility: every judgment or statutory reference must be checked in the official source. Similarly, contract drafts must be read to ensure they actually say what we want and contain no contradictions. Fortunately, some legal AI tools offer built-in solutions for this: for example, Thomson Reuters CoCounsel provides integrated Westlaw references, and the LegalFly platform adds a plain language explanation to every AI-suggested change by default – this makes it transparent why the AI suggests what it does. The best practice is to build in double-checking: the AI result is first reviewed and corrected by the using lawyer, and if the document is important, another colleague also looks at it. This way, AI speeds up the process, but the chance of errors is minimized.
Legal Responsibility and Ethics: From a legal perspective, it is important to clarify that using AI does not reduce the lawyer's responsibility for the work performed. If an AI gives bad advice and we act on it, we are still responsible for the damages to the client, not “the AI.” Therefore, bar associations worldwide emphasize the principle of competent use: you can use AI, but as you would any assistant or resource – you need to know what you're doing and check the result. Some ethical guidelines are emerging: for example, it is recommended to disclose if AI was used in a work (transparency), especially if some content was generated by it. In certain sensitive areas (e.g., criminal cases, privacy rights), some legal organizations may even prohibit the use of open AI due to confidentiality – always check local rules. The EU's proposed AI regulation (AI Act) will also address “high-risk” AI applications, which legal use may fall under, so compliance with regulations will also need to be examined in this area in a few years. On the client side, we can expect more conscious clients to ask: “A robot didn't make my contract, did it?”. It's good to prepare communication in advance for this: explain that yes, we use modern tools to increase efficiency, but all outputs are approved by a lawyer, and AI only helps with routine work. In fact, using AI often serves the client's interest, as it can result in faster service and lower hourly rates for certain tasks.
Implementation Strategy: It is advisable to start small with implementation – e.g., within a pilot project. Select one or two specific tools and tasks (e.g., Spellbook for NDA review, or Copilot for meeting notes), and have a small team try them in real use for a few months. Measure efficiency (time savings, error rate) and collect user feedback. Training is important: lawyers need to learn how to “prompt” – i.e., how to give precise instructions to the AI for the best results. There are already separate training sessions for this, but colleagues can also share their experiences internally. Based on the pilot results, the firm can then decide on wider implementation. Initially, it is also worth creating an internal guideline for AI use: e.g., in which case types it is allowed, what data can be entered, whether the client needs to be informed, etc. These rules can be refined over time, but it's better to have a framework.
Finally, remember that AI is not a magic bullet, but another tool in the lawyer's arsenal. Just as word processing programs or online legal databases once revolutionized work, AI now takes the next step. Law firms that approach it openly and critically – i.e., leveraging its advantages while being aware of its limitations – can gain significant competitive advantages. The goal is not for AI to replace the lawyer, but to free up the lawyer's time for higher-level professional work while the machine handles monotonous or easily automatable subtasks. This way, the client also benefits: receiving quality legal services faster and more cost-effectively, while the lawyer, with the support of their “machine assistant”, can focus on the more important advisory role.

Zoltán Kéri