Party Wall PRO Blog

The Digital Pitfall

25 March 2026

From recent conversations with party wall surveyors, it appears that in the drive for operational efficiency, some are adopting workflows that introduce new and often poorly understood risks through the use of artificial intelligence.

One example is the extraction of owner details from HM Land Registry and the direct input of that information into large language models such as ChatGPT to automate statutory notices.

The reasoning often given is that because the data is publicly available, the associated risk is low. In practice, the legal position is more complex.

As of March 2026, with the Data (Use and Access) Act 2025 coming into force alongside existing obligations under the UK GDPR and the Data Protection Act 2018, this approach may create regulatory and professional exposure if not properly controlled.

The Misconception of the Public Record

A common assumption is that because Land Registry data is part of the public record, it can be reused without restriction.

This is a misconception. Under the UK GDPR, the source of the data does not remove the need for compliance. What matters is how and why the data is processed.

In most cases, once you determine the purposes and means of processing personal data such as names and addresses, you are acting as a data controller and must ensure that processing is lawful, fair and transparent.

The Information Commissioner's Office has consistently made clear that "publicly available" does not mean "freely reusable" for any purpose. Its guidance on lawful basis and data sharing emphasises that organisations must assess necessity and proportionality before reusing personal data.

Using that data within an AI workflow is likely to constitute further processing and may involve sharing data with an external provider.

Establishing a lawful basis under Article 6 of the UK GDPR is therefore essential. While serving notices may rest on a lawful basis such as legal obligation or legitimate interests, the use of third party AI tools must still meet the test of necessity and proportionality.

Data Sharing and International Considerations

When personal data is entered into many AI tools, it may be shared with a third party, depending on the provider's terms and technical setup.

Unless a tool is configured with appropriate contractual safeguards, such as a data processing agreement and clear limitations on data use, the information may be processed outside your direct control. This could include handling in other jurisdictions or use for service improvement and monitoring, as reflected in published provider policies and regulatory guidance on cloud services.

The Data (Use and Access) Act 2025 introduces a more flexible approach to assessing international data transfers, often described as a "not materially lower" standard of protection. This builds on the UK's existing transfer risk assessment approach and adequacy framework.

In practice, organisations must assess whether the level of protection in the destination country is sufficiently comparable to that in the UK, as outlined in ICO guidance on international transfers.

Using general purpose AI tools without clarity on data residency or safeguards may therefore raise compliance questions, particularly where personal data is involved.

Evolving Professional Expectations

Alongside legal obligations, professional expectations are also developing.

The Royal Institution of Chartered Surveyors has issued guidance on the responsible use of technology and artificial intelligence, which sits alongside its broader Rules of Conduct and ethics framework. Together, these are increasingly shaping how professional standards are interpreted in practice.

Firms are expected to demonstrate that they understand and manage the risks associated with digital tools. This may include maintaining internal policies, documenting how tools are used, and keeping appropriate records of risk assessments, in line with general professional and risk management expectations.

Where AI is used to assist in drafting notices, responsibility for the accuracy and compliance of the output remains with the surveyor. This reflects established professional principles that advice and statutory documents cannot be fully delegated to automated systems.

Statutory Notices and Practical Risk

A party wall notice remains a statutory document, and its validity is determined by compliance with the Party Wall etc. Act 1996 in terms of content, timing and service.

However, the process used to prepare that notice is not irrelevant. If personal data is handled in a way that breaches data protection law, this may expose the surveyor or their firm to regulatory scrutiny by the Information Commissioner's Office or to challenge from affected parties, even if the notice itself is technically valid.

In practice, this can still create delays, cost and reputational impact, particularly if concerns are raised by adjoining owners or their advisers.

Professional indemnity implications may also arise, depending on the circumstances and the terms of the policy. Market commentary and broker guidance indicate that some insurers are beginning to review how AI related risks are managed, particularly where use falls outside documented procedures.

A Compliant Way Forward

None of this means that AI cannot be used. It means that it must be used with care.

A more robust approach includes:

Anonymisation

Limit inputs to technical and factual information. Use placeholders for personal data and insert identifying details only at the final stage. This aligns with the data minimisation principle under the UK GDPR.
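To make the idea concrete, the placeholder approach can be sketched in a few lines of Python. This is an illustrative sketch only, not part of any specific product: the placeholder tokens and the `redact`/`restore` helpers are hypothetical names chosen for the example.

```python
# Illustrative sketch of placeholder-based anonymisation before any AI call.
# All names here (PLACEHOLDERS, redact, restore) are hypothetical.

PLACEHOLDERS = {
    "owner_name": "[OWNER_NAME]",
    "owner_address": "[OWNER_ADDRESS]",
}

def redact(text: str, personal_data: dict) -> str:
    """Swap real personal details for neutral placeholders before the
    text leaves your control (e.g. is sent to a third-party AI tool)."""
    for key, value in personal_data.items():
        text = text.replace(value, PLACEHOLDERS[key])
    return text

def restore(text: str, personal_data: dict) -> str:
    """Re-insert the real details locally, at the final stage."""
    for key, value in personal_data.items():
        text = text.replace(PLACEHOLDERS[key], value)
    return text

personal = {
    "owner_name": "Jane Doe",
    "owner_address": "1 Example Road, London",
}

draft = "Notice is served on Jane Doe of 1 Example Road, London."
safe_draft = redact(draft, personal)          # safe to send externally
# ... AI-assisted drafting would operate on safe_draft only ...
final_notice = restore(safe_draft, personal)  # completed locally

print(safe_draft)
print(final_notice)
```

The key design point is that the real names and addresses never leave the surveyor's own system: the AI tool only ever sees the placeholder tokens, and the identifying details are merged back in as a final local step.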

Transparency

Ensure that your terms of engagement and privacy information explain how personal data is used, including any role played by AI tools, as required under transparency obligations.

Appropriate Tools

Where possible, use systems that are designed for professional workflows and offer clear contractual and data handling safeguards, including defined processor roles and data residency controls. Platforms developed specifically for party wall surveyors, such as Party Wall PRO, are structured with these considerations in mind.

Final Thoughts

Operational efficiency alone is unlikely to justify non-compliant handling of personal data, as reflected in ICO enforcement practice.

In 2026, a competent surveyor should treat data as a managed risk within their practice, not simply as input for automation. AI can support professional work, but only where it is used within a framework that respects both legal obligations and professional standards.
