AI-generated art looks alarming, but it doesn’t have to be

Just a few months ago, the concept of using artificial intelligence to generate unique works of art seemed avant-garde and futuristic. Soon it will be as commonplace as running a search on Google.

Microsoft Corp. announced this week that it was making the most of its billion-dollar investment in OpenAI, an artificial intelligence research company, and bringing that company’s best-in-class artificial intelligence service to Microsoft 365, the company’s flagship set of software services. Microsoft Designer is powered by OpenAI’s DALL-E 2 AI technology and will generate an image from whatever users type into a box, such as “cake with berries, bread and pastries for fall.”

It’s a quick step forward for DALL-E 2, which was first announced just six months ago. Although the Designer app is currently only available in beta, the rollout highlights how quickly art-generating AI is evolving, even as artists express concern. Some artists’ names appear particularly frequently in text prompts fed to similar art generators, causing some to worry about the technology’s impact on their careers. AI ethicists are also worried about a flood of new fake images hitting the web and fueling misinformation campaigns.

However, Microsoft’s involvement in this area is good news. The company is echoing OpenAI’s limited deployment of DALL-E 2, as well as its strict rules about the types of images it will generate. For example, DALL-E 2 prohibits images showing explicit sexual and violent content, and does so by simply removing such images from the database used to train its model. Microsoft said it would use similar filters.

Microsoft also said it would block text prompts on “sensitive topics,” which it didn’t elaborate on, but which will most likely reflect DALL-E 2’s policy of disallowing queries related to things like politics or illegal activities, or images of well-known personalities like politicians or celebrities.

Tech ethicists have wrung their hands over the fact that open-source versions of this kind of technology, such as a tool released in August by British startup Stability AI, will give everyone free access to fake content that could infect social networks and disrupt upcoming elections (think fake images of Joe Biden or Donald Trump in controversial situations).

But a carefully curated release of the technology from Microsoft seems to dampen that prospect for two reasons. First, opportunistic photo fakers are more likely to have their efforts hampered by filters built into the technology. Second, as more people use these tools, the general public will become more aware that photos on the internet could be generated by AI.

It’s extraordinary that this form of creative artificial intelligence is evolving so rapidly and that Microsoft’s Designer tool will soon be alongside mainstays of enterprise software like Word, Outlook and Excel. This is, as some have already pointed out, like clip art on steroids, limited only by the imagination of the user.

It also highlights how difficult it can be to predict the direction AI will take. A few years ago, tech experts widely expected that we would soon have self-driving trucks and cars on the road, reducing accident rates and putting human drivers out of work. Today it is artists and illustrators who have the most reason to worry, although the nature of their work may simply change. As AI art generation reaches the fingertips of millions, they will need to be flexible.

More from Bloomberg Opinion:

Cybersecurity needs its own Sarbanes-Oxley: Tim Culpan

OpenAI project risks being biased without further scrutiny: Parmy Olson

Taylor Swift will always be bigger than AI: Tyler Cowen

This column does not necessarily reflect the opinion of the Editorial Board or of Bloomberg LP and its owners.

Parmy Olson is a Bloomberg Opinion columnist covering technology. A former journalist for the Wall Street Journal and Forbes, she is the author of “We Are Anonymous”.

More stories like this are available at bloomberg.com/opinion
