Feature “Deceptively spacious.” “Prime location.” “Up-and-coming area.” “Some original features,” which occasionally turn out to be asbestos. Estate agents are known for sometimes stretching the truth in pursuit of a sale, but the generative AI boom appears to have thrown things into overdrive – providing an easy way to present images of properties which simply don’t reflect reality.
Property photography is an art, more than a science: careful positioning of the camera, the use of extreme wide-angle lenses to create the illusion of space, and even “staging” rooms with furniture made to a slightly smaller scale are just some of the tricks of the trade. Image-generating models, though, have given agents an entirely new way to present a property.
This listing for a £350,000 (around $468,000) property in Durham is a prime example. The lead image shows a suspiciously smooth frontage, devoid of texture. Closer inspection reveals the tell-tale signs of generative AI: an awning over the door that doesn’t quite make sense, a wall that turns into a hedge partway along its length, and a flowerbed which appears to be planted in place of the neighbor’s path.
ABOVE: The AI-generated and BELOW: the real Eaglescliffe house frontage, as they appear on the estate agents’ website

Click through the photos and, some 22 images later, you’ll eventually find the original photograph fed to the image generator – and it tells a very different story. The flowerbeds and hedges are gone, and the short roof over the door is replaced with something closer to what you’d expect to see. More importantly, an entire commercial property adjoining the premises to the left has been edited out of existence in the AI’s fever dream – a hair and beauty salon whose roof literally touches the house’s bay windows, yet is completely absent from the listing’s lead image.
An AI-assisted room in the Eaglescliffe house – note the en suite’s door
The same room pictured above without the AI assist
Other photos in the listing flip between the house the agents are selling and the house, presumably, they’d like to sell. Empty bedrooms are “staged” with virtual furniture, but structural elements are modified in the process: an en suite suddenly grows in size and its door turns half-transparent while opening the wrong way; an oven and countertop appear in the kitchen where a radiator sits in reality; the toilet in the bathroom switches walls entirely – and gains a floor-length curtain which, in defiance of the laws of physics, plumbing, and common decency alike, cuts right through its waste pipe, while one leg of the sink pedestal passes clean through a wicker basket.
Only losers connect their toilet pans to the waste pipe – which we’re sure you’ll agree totally ruins the aesthetic. Just add in a lovely, lovely (handily beige) toilet curtain instead
As the late Victor Meldrew might say, “I don’t believe it” – and nor should you.
“For me, the use of AI for imagery in property listings is a major red flag aligned to what was previously covered by the Property Misdescriptions Act,” Adrian Tagg (MRICS), associate professor of building surveying at the University of Reading, told The Register.
“While you’d think that those spending hundreds of thousands of pounds on something would do their due diligence, it’s always surprising to me how few people actually have a ‘proper’ survey, which is often a minuscule percentage of the agreed sale price.
“Unless explicitly stated in the property details and on the associated images, altered images are (in my opinion) a property misdescription or misrepresentation. As an academic and practising chartered building surveyor (professional member of the RICS) we’re bound by regulations to deliver evidence-based opinion and hold a duty of care to deliver correct, appropriate advice. For this we are obliged to have professional indemnity insurance to cover the cost of litigation for getting things wrong.
“Estate agency has never really had this professional duty, and ultimately it’s all about sales and doing ‘the deal.’ Therefore I’m not surprised that there appears an openness to accept AI when ultimately it’s an industry with little obligation to be accountable for their actions.”
Roseberry Newhouse, the agency responsible for the alternate-reality imagery of the Station Road property, isn’t alone in turning to AI to tart up its listings. Back in November 2023, consultancy firm McKinsey & Company was extolling the benefits of generative AI for the real estate industry, predicting it “could generate $110 billion to $180 billion or more in value.”
Startups like REimagineHome, eager to snatch a slice of that pie, offer “virtual staging,” in which empty rooms can be filled with furniture at the click of a button – and even presented in various styles to suit the tastes of particular would-be buyers.
“AI staging is in effect the same as CGI used to sell off-plan apartments or houses by property developers,” Tagg told us of these services, “and as long as this is explicitly indicated on the images then persons of reasonable intellect should be able to understand this. Dropping images of furniture onto existing property images is not really AI, most kids doing Minecraft adopt the same skill set.”
Dropping a few tasteful virtual Chippendales into a property is one thing, but altering a room’s underlying structure – the magical toilet, the TARDIS-like zero-depth built-in wardrobes, radiator removal, and in one case the flattening of a wall with a very obvious dog-leg section, not to mention the demolishing of the attached commercial property and enfloration of the neighbor’s pathway – would seem to go a step or six too far.
Historically, such misleading listings would have fallen foul of the Property Misdescriptions Act 1991, which made it a criminal offence to make false or misleading statements about a property offered for sale. That Act was repealed in 2013, and its closest replacement, the Consumer Protection from Unfair Trading Regulations 2008, is a more general piece of legislation which does not specifically target property sales.
Approached for comment on whether the sellers had approved the generative imagery, and whether Roseberry Newhouse disputed that the presentation of the property and its environs was misleading, bordering on potentially illegal, a spokesperson could only offer a sharp intake of breath and a declaration that “we’re a little busy at the moment.” Questions sent to a provided email address had not been answered by the time of publication.
Roseberry Newhouse, at least, provides both genuine photography and a 360-degree walkaround of the house as it exists in reality, to act as a comparison to the “aspirational” gen-AI shots of a sports car in the drive and no beauty salon outside your front window. Other agents aren’t quite so scrupulous.
“A mate of mine is house-hunting just now, and a month or so back he went to view a house only to discover that all of the pictures from the website were AI images and the actual house was in considerably poorer condition,” a reader told us of a less legitimate agency. “He walked out immediately, unsurprisingly.
“On very close inspection the AI images did bear a tiny tiny watermark in the bottom corner, but it was so subtle that you would never see it unless you were purposefully looking for it.”
“There’s a reason why, in an online world, houses remain one of the few things that cannot be bought by clicking on a mouse,” Tagg concluded of the real estate industry’s rush to embrace technology. “It still needs human intervention and trust alongside transparency – this is paramount.” ®