Romantic painters such as Caspar David Friedrich used landscapes to reflect the mood of their subjects, an extension of their consciousness and emotions. Two centuries later, a group of scientists has revived this idea, using artificial intelligence to gauge the mood of today's most important human environments: cities.
Researchers at the University of Missouri have developed a system capable of mapping the emotional state of citizens across different neighbourhoods. It works by analysing geolocated images and social media text, applying sentiment analysis techniques — an approach first developed in the 1990s to interpret positive or negative opinions in written language.
By combining natural language processing with computer vision, this technology could provide an emotional layer to urban analysis — complementing traditional indicators related to infrastructure or mobility. Could this become a tool for improving well-being beyond standard planning metrics?
The method developed by professors Jayedi Aman (architecture) and Tim Matisziw (geography and engineering) is based on two complementary layers of analysis. First, it gathers public posts on social media platforms such as Instagram that include geolocation data. The accompanying images and captions are processed using language models that classify the emotions expressed — happiness, calm, stress or frustration. So far, this is similar to existing sentiment analysis initiatives.
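The study relies on language models for this classification step, and its exact pipeline is not reproduced here. As a minimal illustrative stand-in, a toy keyword-based classifier shows the shape of the first layer: geotagged posts go in, emotion labels come out. The emotion categories come from the article; the keyword lists, coordinates and captions below are hypothetical.

```python
import re

# Toy emotion classifier for geotagged captions. Illustrative sketch only:
# the researchers use language models, not keyword matching, and these
# keyword lists are invented for the example.
EMOTION_KEYWORDS = {
    "happy":      {"love", "beautiful", "great", "fun", "amazing"},
    "calm":       {"quiet", "peaceful", "relaxing", "serene"},
    "stressed":   {"crowded", "traffic", "noise", "late"},
    "frustrated": {"broken", "dirty", "closed", "unsafe"},
}

def classify_caption(caption: str) -> str:
    """Return the emotion whose keywords appear most often, or 'neutral'."""
    words = set(re.findall(r"[a-z']+", caption.lower()))
    scores = {emotion: len(words & kws) for emotion, kws in EMOTION_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "neutral"

# Hypothetical geotagged posts, as they might arrive from a social platform.
posts = [
    {"lat": 38.951, "lon": -92.334, "caption": "Beautiful peaceful morning in the park"},
    {"lat": 38.902, "lon": -92.301, "caption": "Stuck in traffic again, so much noise"},
]
labeled = [{**p, "emotion": classify_caption(p["caption"])} for p in posts]
```

In the real system this classifier would be replaced by a language model; the point is only that each post ends up with coordinates plus an emotion label, ready for the spatial layer.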
The second layer is where the project becomes truly innovative. Using city images from sources such as Google Street View, the researchers apply computer vision techniques to identify visual elements of the urban landscape — vegetation, pedestrian areas, walls, fences and building density. This multimodal approach allows text and image data to be cross-analysed.
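As a hedged sketch of this second layer, suppose a segmentation model has already been run on each street-view image and has returned the fraction of pixels covered by each visual element. Aggregating those per-image outputs into per-location features could then look like this; the element names, field names and numbers are assumptions for illustration, not the study's definitions.

```python
from statistics import mean

# Hypothetical per-image outputs of a semantic segmentation model run on
# street-level imagery: fraction of pixels per visual element.
images = [
    {"location": "A", "vegetation": 0.35, "sidewalk": 0.20, "wall": 0.05},
    {"location": "A", "vegetation": 0.30, "sidewalk": 0.15, "wall": 0.10},
    {"location": "B", "vegetation": 0.05, "sidewalk": 0.10, "wall": 0.40},
]

def streetscape_features(images: list, location: str) -> dict:
    """Mean pixel fraction of each visual element across a location's images."""
    subset = [img for img in images if img["location"] == location]
    keys = [k for k in subset[0] if k != "location"]
    return {k: round(mean(img[k] for img in subset), 3) for k in keys}

features_a = streetscape_features(images, "A")  # location A: green, walkable
features_b = streetscape_features(images, "B")  # location B: walled, little green
```

The output is one feature vector per location, which is what makes the cross-analysis with the text layer possible.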
The two datasets are then correlated to generate what the researchers call a “sentiment map” — a spatial representation showing which areas of the city display higher or lower levels of emotional well-being. For example, if posts tagged near a newly opened park show more “happy” classifications, planners can investigate which design features contributed to that response and replicate them elsewhere.
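The correlation step can be sketched as binning labeled posts into grid cells and scoring each cell. The grid size, the numeric weights assigned to each emotion, and the coordinates below are all illustrative assumptions; the article does not specify how the researchers aggregate.

```python
from collections import defaultdict

# Hypothetical numeric weights per emotion label and grid resolution.
WEIGHTS = {"happy": 1.0, "calm": 0.5, "neutral": 0.0,
           "stressed": -0.5, "frustrated": -1.0}
CELL = 0.01  # grid cell size in degrees (assumed)

posts = [
    {"lat": 38.951, "lon": -92.334, "emotion": "happy"},
    {"lat": 38.952, "lon": -92.331, "emotion": "calm"},
    {"lat": 38.902, "lon": -92.301, "emotion": "frustrated"},
]

def sentiment_map(posts: list) -> dict:
    """Average emotion weight per grid cell: a coarse 'sentiment map'."""
    cells = defaultdict(list)
    for p in posts:
        # Snap coordinates to the nearest grid cell centre.
        key = (round(p["lat"] / CELL) * CELL, round(p["lon"] / CELL) * CELL)
        cells[key].append(WEIGHTS[p["emotion"]])
    return {k: sum(v) / len(v) for k, v in cells.items()}

smap = sentiment_map(posts)  # two cells: one positive, one negative
```

Each cell's score could then be plotted as a choropleth, which is essentially what a "sentiment map" is.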
One of the next steps for the team is to integrate this system into an urban digital twin — a real-time replica of the city that visualises not only infrastructure and movement, but also collective emotional states. The researchers believe that, in the future, such systems could complement or even replace traditional opinion surveys.
According to Aman and Matisziw, their technology could support several areas of urban planning:
- Identifying areas associated with negative emotions, which may signal shortcomings in infrastructure, design or local services.
- Assessing the impact of interventions — such as new green spaces or pedestrian-friendly redesigns — by measuring changes in sentiment before and after implementation.
- Integrating “mood maps” into city monitoring systems, alongside existing indicators like traffic, air quality and safety, to provide a more holistic picture of urban life.
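The second application above, assessing an intervention, amounts to a before/after comparison of sentiment around a date. A minimal sketch, with made-up scores and a made-up intervention date:

```python
from datetime import date

# Hypothetical per-post sentiment scores near a site, before and after a
# park opening on 2024-05-01. All values are invented for illustration.
posts = [
    {"day": date(2024, 3, 1), "score": -0.5},
    {"day": date(2024, 3, 5), "score": 0.0},
    {"day": date(2024, 6, 2), "score": 0.5},
    {"day": date(2024, 6, 9), "score": 1.0},
]

def sentiment_change(posts: list, intervention: date) -> float:
    """Mean score after the intervention minus the mean score before it."""
    before = [p["score"] for p in posts if p["day"] < intervention]
    after = [p["score"] for p in posts if p["day"] >= intervention]
    return sum(after) / len(after) - sum(before) / len(before)

delta = sentiment_change(posts, date(2024, 5, 1))  # positive shift after opening
```

A real evaluation would need far more care (seasonality, changing user bases, control areas), but the basic quantity planners would look at is this delta.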
These are promising applications, but practical limitations quickly emerge. Natural language processing still struggles with irony and sarcasm, which even humans often fail to detect consistently on social media.
There is also the issue of representativeness: those who post online are not always a true reflection of the wider population. Privacy and the ethics of using geolocated information are other important considerations.
Moreover, while the results show correlations between certain features — such as greenery or wider pavements — and positive emotions, causality is not guaranteed. Social and economic conditions, cultural factors or even the time of day may play a role.
On the technical side, as highlighted in a recent paper in Information Fusion, multimodal systems can suffer from a lack of high-quality datasets, inconsistencies between modalities (text versus image), and reduced effectiveness when applied in different cultural contexts.
We often talk about the Smart City: a model in which smart electricity grids, electric mobility, advanced sensor networks and digital twins converge to improve efficiency, sustainability and quality of life.
However, a far more ambiguous dimension remains: measuring the urban experience and its emotional texture. Urban planning, when approached in a hierarchical, top-down manner, can have unintended consequences, however well-intentioned it may be. That is why initiatives such as the University of Missouri's are an opportunity to explore this elusive dimension.
Still, one deeper question remains unanswered: even if we can measure the emotions of a city, can we truly automate happiness? Philosophers and theologians have tried to understand it for millennia. Whether algorithms can ever come close remains to be seen.
Source:
David is a journalist specializing in innovation. From his early days as a mobile technology analyst to his latest role as Country Manager at Terraview, an AI-driven startup focused on viticulture, he has always been closely linked to innovation and emerging technologies.
He contributes to El Confidencial and cultural outlets such as Frontera D and El Estado Mental, driven by the belief that the human and the technological can—and should—go hand in hand.