CallisonRTKL | Build: London ’19 for Architecture | Unreal Engine

[APPLAUSE] >> Can you hear me? Good. So I’m not going to do a live demo for obvious reasons. But I really want to talk to you about data-driven design and Unreal. A very exciting topic, mostly
because if you put the word data in front of things,
people go, oh, technical, which is fun. Especially architects–
we architects love data-driven design. And why? Well, because it really helps
drive the projects forward. Right now we are creating the smartest, most well-developed projects that have ever been created. And this is because
of the data that we’re gathering from across the globe and from across the different disciplines as well. We’re collecting data
on sustainability. We’re collecting data
on value engineering. We’re collecting data all the
way down to the minute details of facade development design
where we can design facades to optimize for visibility,
for energy consumption. And so this is really just because we’ve been using all the new advanced tools which are now coming to fruition. And the big question we were always asking was, where does
Unreal fit in this? Where does a real-time
gaming engine come into that data-driven design process? And the obvious
response is, well, it’s a real-time rendering tool. It’s great. We can use it to
visualize things. And we can create gorgeous
visuals. We know that. We see examples
of it all the time in amazing videos of
super cool speakers. But with the introduction of
VR and immersive technologies, we were like, why
are we not– well, this is a big opportunity
because all the data collection excludes one really,
really important feature and that is you. It’s me. It’s this waving man who I
spent three hours making. Essentially, we are the people
of the future who are going to be occupying these spaces. We are the people who are going
to be working in the offices. We are the people who are
going to be shopping in retail. We are the people– if we’re
born in the wrong generation, we’re not going to be able
to afford the housing. But regardless, it’s important that the design be focused around you. So how is it that we
do that within Unreal? Well, we need to, first
of all, look at a way to bridge the two realities. Because we have digital
reality which is very fancy. That’s where we work.
We love digital reality. We create stuff in VR, it’s
phenomenal. But we live in reality. So there really needs to
be a commonality identified between these two spaces. Because this is us. We actually are there to
build real things, real spaces that we’re going to occupy. So what is important to us is actually making reality itself. To do that, we kind of started playing around. I wanted to just show a quick
research and development project which we started
on called Project Insight, not to be confused with Marvel’s
Project Insight, which was– I think if there’s
any geeks in the room, they’ll remember this. It collects data to kill people. Ours is just a little bit
friendlier than that. Using Unreal– it’s lovely. But a similar principle is that
we have this wealth of data. We need to be collecting it to
make more informed decisions– not kill people– and drive the projects
and the designs forward. So imagine a space that is designed around where you want to walk, what you want to look at, where you’re going. There are no obstructions. There’s nothing in there that makes you feel uncomfortable. And to enhance the
decision-making process– this is the most
important thing. We’re steering away from
design theory and more into design fact because
we’ve got the data to suggest that it’s true. So to do that, we kind of
started all the way back with the digital
reality drivers. What data is it that we
can start to gather, very simply, from our Unreal models? And we can do this very
easily with Unreal. It’s got phenomenal
tools because again, it’s the gaming engine side of
it that allows us to do it. So we build these Blueprints– code which runs seamlessly in the
background of every single one of our experiences whether
we want to use the data or not because we want to build
up this vast wealth of it. And initially, we looked
at just collecting four very simple things– user
position, target position, target mesh, and time. And I’m going to quickly go over why each of these is really important to the design process. User position– this
is movement, where people move through spaces. So we’re always tracking where people want to go, and where they’re not going, to help reinforce, maybe, the common paths of travel. Where is that? Or the path of
least resistance– where are people stepping
over stuff in order to get somewhere faster? At the same time, it allows us to clearly identify where the dead zones are in projects. Where are people not going? Or where are people going that maybe we don’t want them to go? And how is it that we can
modify that urban master plan to change that? Target position’s
a very fun one. This is basically monitoring the sight data– where we’re looking,
what we’re looking at. And what’s
interesting about this that we kind of
discovered very early on is that it’s about
the viewer angle. Architects love to look up. We just think everyone walks
into a space and goes, oh, wow. No one does this. Normal people walk
into a space– and I say normal because
we’re not normal– normal people walk
into a space and they keep everything at eye level. And that’s what the data we’re collecting shows us: looking up is very uncommon unless you’re actually there to see something
very special. In malls, we don’t want to
put too much into skylights because no one actually
knows what the skylights look like in malls. But the more important thing we can do with the sight data is we can actually use it in Unreal to test hits against particular object meshes. Which meshes are getting the most hits in each user experience? So we can very clearly identify the important parts of the project, and the parts which no one is looking at. But more interestingly, if we move that building and we put it somewhere else, or we move the art sculpture or the green wall, does it still have the same value? Does it still have the same attention? So we can then create smarter spaces based on that. And lastly, time.
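That mesh-level tallying is simple to sketch outside the engine. Here is a minimal, hypothetical version in plain C++– the struct, its field names, and the mesh identifiers are illustrative assumptions, not the actual Blueprint schema:

```cpp
#include <map>
#include <string>
#include <vector>

// One telemetry sample– the four values described above.
// Field names and types here are assumptions for illustration.
struct GazeSample {
    double userPos[3];      // where the user is standing
    double targetPos[3];    // where their view ray lands
    std::string targetMesh; // identifier of the mesh that ray hit
    double timestamp;       // seconds since the session started
};

// Tally how many gaze samples landed on each mesh across all users.
// High counts flag the important parts of the project; meshes that
// never appear are the ones no one is looking at.
std::map<std::string, int> CountMeshHits(const std::vector<GazeSample>& samples) {
    std::map<std::string, int> hits;
    for (const GazeSample& s : samples) {
        ++hits[s.targetMesh];
    }
    return hits;
}
```

Inside Unreal itself, the samples would presumably come from something like a periodic line trace from the VR camera, with the hit component's mesh name as the key.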
Time was an interesting one. We wanted to use it
as a way of evaluating how long someone spends in
any particular zone in VR. What it actually showed
us is that people spend very little time
in VR spaces, which was really useful for
allowing us to actually create unique experiences within
Unreal, and to really focus on getting the most important data out of the VR experiences as fast as possible. But all of this is
useless unless we can prove that it
works in reality or there’s a
commonality in reality, that there’s a coexistence, that
the way that people move in one is the way that people
do it in reality as well. So to do that, we had to look
at a couple of reality drivers that we could use as
benchmark comparisons against the digital ones. So traffic flow analysis,
user engagement, target value, and time. The obvious one, traffic flow analysis– this is very easy. We partnered with
some sub-consultants, put up sensors in some
CallisonRTKL projects, and monitored people’s movements
throughout these spaces. And how does that then correlate
to the digital movement? How does that
actually look whenever you put those two data
sets side by side? User engagement– we had this
mad idea that we would just give people Google Glass-type things that monitor where you’re looking in real space– lots of issues with that. But we ended up thinking
that the smarter way to do that would actually
be using The ‘Gram, I think, as the kids are calling it– The ‘Gram, Tumblr– does
that exist anymore? I have no idea–
Flickr. These geographical
social media tags are really the way
that we can identify which spaces inside projects are important. In this room, there are people taking selfies of big signs, or of the special room or the special chair that they’ve seen. And again, we’re attributing that
to specific features in a zone and then seeing whether or not
that compares to our data point cloud that we collected
within Unreal. Very exciting– now, time
I always throw in here just because what we discovered after
lots and lots of digging is there’s absolutely
no correlation between digital time and
time spent in reality. And this is true, I assume, because we don’t have mobile
phones in digital reality. We don’t tie our shoelaces
in digital reality. And the general
distractions are not there. But there were still
some data points that we were able to collect
from the reality version and actually find
the commonality in the digital sense as well. So that’s really the
important part of it because we’re really then trying
to take these two data sets and actually see how
they speak to each other, and what percentage
of accuracy there is going from one to the other. Now, I know what
you’re thinking. The way that I move around in VR
is in no way similar to the way that I would move
around in a real space. And you would be
absolutely right. There’s just no
commonality between me jumping on tables, through
lawns, and on top of pools– because that’s
what you do in VR– compared to me in a
real space unless we’ve had a few drinks, in
which case we’re dancing on tables exactly the same. But that’s why we invoke
the wisdom of crowds, which is basically the strategy where we look at the idea that the opinions or actions of any one person mean absolutely nothing. But as a collective,
as a crowd, whenever we start to look at
this, we can actually start to bring some points
of similarity across from it. So whenever we start to take
not just data from 5 people or 10 people from
a particular space but we overlay this onto
many huge data sets, we can start to create this heat
map of movement or target data within a digital space and
do the same in reality, and see what the points
are where they actually start to conglomerate and
actually find a way that they actually start to match. And this is what’s
really important. It’s the collection of
the vast amounts of data. It has to be something which
we’re constantly collecting, constantly working with
because the idea of this isn’t just to
inform new designs. The next step is to actually use it in digital twin modeling. If we have a project which already exists and we build a digital clone of it, then we can play
around in the digital space, see how people react in it,
or vice versa– play around with the reality data and use it
to influence the digital space. Because as I said, the
most important part of this is increasing the efficiency. And the more data we collect, the better it gets. And we can explore
additional data drivers. I know there’s certain people
who are putting helmets on people and
scanning their brains, trying to sense
emotions in space. It’s mental. And all this data is
really useful in actually targeting this crossover between
reality and digital reality. Because at the end
of the day, that’s what we’re looking to do. We’re looking to create
spaces based around you guys, about the way that you
find comfortable living, the way that you move
around the space, and what it is you want to look at. And the more that we
do this, then the more advanced our designs are
actually going to be. So we’re continuing
this down the line. And we’re very excited with
the direction it’s going. I realize I just had
a really short time to kind of go over that. But if you see me
outside of this, then please come and
ask me some questions. I would love to talk about it. I’m David,
and that’s our lovely project. Thank you. [APPLAUSE]
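The heat-map comparison the talk describes– binning movement data from many users into a grid for the digital space, doing the same for the sensor data from the built space, and measuring where the hot spots coincide– can be sketched in plain C++. This is a minimal illustration under stated assumptions: the grid size, cell size, and the overlap score are all inventions for the sketch, not Project Insight's actual method.

```cpp
#include <array>
#include <cstddef>
#include <utility>
#include <vector>

// Assumed site: a 40m x 40m plan binned into an 8x8 occupancy grid.
constexpr std::size_t kGrid = 8;
constexpr double kCell = 5.0; // cell size in metres

using HeatMap = std::array<std::array<int, kGrid>, kGrid>;

// Bin 2D positions (metres) into the grid– one grid for the VR
// walk-throughs, one for the sensor data from the built space.
HeatMap BinPositions(const std::vector<std::pair<double, double>>& positions) {
    HeatMap grid{};
    for (const auto& [x, y] : positions) {
        auto cx = static_cast<std::size_t>(x / kCell);
        auto cy = static_cast<std::size_t>(y / kCell);
        if (cx < kGrid && cy < kGrid) ++grid[cx][cy];
    }
    return grid;
}

// Crude "commonality" score: the fraction of hot cells (at or above a
// visit threshold) that are hot in BOTH the digital and reality maps.
double Overlap(const HeatMap& a, const HeatMap& b, int threshold) {
    int hotEither = 0, hotBoth = 0;
    for (std::size_t i = 0; i < kGrid; ++i) {
        for (std::size_t j = 0; j < kGrid; ++j) {
            bool ha = a[i][j] >= threshold, hb = b[i][j] >= threshold;
            if (ha || hb) ++hotEither;
            if (ha && hb) ++hotBoth;
        }
    }
    return hotEither ? static_cast<double>(hotBoth) / hotEither : 0.0;
}
```

With only a handful of samples the score is noisy; the wisdom-of-crowds point is precisely that it only becomes meaningful once many users' traces are overlaid on each grid.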



  • Sapphire Spire

    February 12, 2020

    I prefer design driven data.

  • Matheus Lacerda

    February 13, 2020

I like how he is so excited about his presentation. Really makes you think he has some cool stuff to show, which he did.

