Apple presents its open-source “neural Photoshop”

Apple, in collaboration with researchers from the University of California, has presented the MGIE neural network. You can load a photo into it and edit it not with traditional settings sliders, but with text commands in natural language. Anyone can try it out.

The service currently supports prompts in English only. Judging by early tests and journalists' reviews, MGIE interprets queries fairly accurately and can do the following:

– Apply filters
– Perform color correction
– Crop, rotate images
– Edit individual objects in the photo
– Adjust contrast, brightness, saturation

For example, you can ask the neural network to make people's faces smaller, the moon more contrasty, and the colors of clothing less bright. MGIE identifies the relevant objects in the photo on its own and changes them as requested, or edits the entire image.

As a test, writers at 9to5Mac uploaded a photo to the neural network and asked it to “make the sky a little redder.” MGIE interpreted the command as “make the sky in the photo a shade of red rather than shimmering blue” and returned the result.

MGIE is not yet available as a standalone service: its source code can be downloaded from GitHub, and a working demo is hosted on the Hugging Face website. Whether Apple will use the neural network in its own products is not yet known.