These pictures could come from any house party on any night of the week in any part of the country.
Except, they’re completely fake.
The pictures were generated by an artificial intelligence. The house party never took place and the people in the photos don’t exist.
Shared on Twitter over the weekend, the images were conjured by an AI platform called Midjourney. Similar to OpenAI’s DALL·E image generator, Midjourney has been in open beta since July last year.
As more people use the platform, it learns and refines and gets better at producing realistic-looking images.
And while the pictures may seem incredibly realistic at first glance, there are some inaccuracies when you look a little harder.
For example, AI seems to have a difficult time generating realistic-looking hands. In the photos, many of the partygoers have misshapen hands or a few too many fingers.
The same goes for teeth – far too many are crammed into each person’s mouth, with a strange smudging effect going on.
As people on Twitter have pointed out, the AI also seems to have its own implicit bias. Nobody in any of the generated images has a skin tone other than white.
‘I had to be specific in order to get male-looking AI people,’ explained Twitter user Miles Zimmerman, who posted the images.
‘And even then, variation is a challenge,’ he added. ‘It definitely defaults to white people when you ask for “people”.’
Even so, the images are certainly close enough that, at a quick glance (perhaps while scrolling a feed), the average person may not clock that they’re fake. Which, of course, raises questions about telling fact from fiction online as these artificial intelligence platforms continue to become even more advanced.