仓 颉 的 诗 2020
Cangjie's Poetry is the second episode of the Cangjie project.
*Compared to the last chapter, this new episode goes beyond a mere visual celebration; more importantly, it is a semantic human-machine experience that evokes feeling.
Humans and machines are in constant conversation. Intelligent machines today are not only observers of the world; they also make creative decisions. If AI imitates human beings to generate a symbolic system and actively communicates with us based on its own understanding of the universe, to what degree will its messages and meanings recontextualize our coexistence?
Visualization & Sound Design
Intelligent System Design
Intelligent System Implementation
Interactive Media Art Installation
VIDEO DOCUMENTATION [ 2 minutes ]
Introducing Cangjie's Poetry 仓 颉 的 诗:
Cangjie’s Poetry is an intelligent multimodal system designed as a conceptual response to a future semantic human-machine reality. Inspired by the legendary ancient Chinese historian Cangjie (c. 2650 BCE), who invented Chinese characters based on the characteristics of everything on earth, we trained a neural network, which we call Cangjie, to learn the constructions and principles of over 9,000 Chinese characters. After successful training, Cangjie can interpret images through the lens of Chinese characters and produce new symbols constructed from Chinese strokes. Meanwhile, we use a pre-trained model (DenseCap) to simultaneously generate localized descriptive sentences of the images in natural language, creating meanings for this symbolic system.
In the art installation, the Cangjie system captures its surroundings using a camera and transforms the real-world video stream into a cluster of ever-changing new symbols in real time. These novel symbols, made of Chinese strokes and entangled with the imagery captured by the camera, are visualized algorithmically as an abstract pixelated landscape. The landscape dynamically moves, evolves, and writes poetry based on the live image data captured by the camera. We project this visualization of the semantic landscape on the wall of the exhibition space as the first projection. Meanwhile, Cangjie also generates descriptive sentences of its surroundings based on its interpretation. These sentences are designed as flowing poetry written in ink, assembled with real-time captured imagery fragments as the second projection in the space.
A gallery mock-up view of Cangjie’s Poetry’s art installation. (left) The first projection: live streaming imagery writes new symbols. (right) The second projection: intelligent visual poetry describes surroundings, 2020. (© Weidi Zhang )
Cangjie’s Poetry’s intelligent system consists of two main parts: 1. a neural network that transforms the live image stream into a cluster of new symbols constructed from Chinese strokes in real time; 2. a pre-trained model (DenseCap) that both localizes and describes salient regions of the live image stream in natural language.
The first visualization presents an ever-changing poem written in the novel symbols generated through the lens of Cangjie, which continuously evolves and composes poetic ink imagery. Using unsupervised learning techniques, we trained a neural network (named Cangjie) on vector stroke data from over 9,000 Chinese characters. After successful training, the discriminator and the encoder/generator reach a stable state akin to a Nash equilibrium, and the network learns a low-dimensional latent representation of these images. Thus, when the system processes the live stream of its surroundings, the encoder network produces a latent representation of each frame, and the generator network reconstructs the image and generates novel symbols from that latent representation.
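Under heavy simplification, the encode-then-generate step can be sketched as below. The dimensions, function names, and random stand-in weights are all illustrative assumptions; in the actual system the weights come from the adversarial training described above, and the generator emits stroke geometry rather than bare indices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 64x64 grayscale frame, 32-dim latent space,
# a vocabulary of 30 stroke types.
IMG_DIM, LATENT_DIM, N_STROKES = 64 * 64, 32, 30

# Random stand-ins for the trained encoder/generator weights.
W_enc = rng.normal(scale=0.01, size=(LATENT_DIM, IMG_DIM))
W_gen = rng.normal(scale=0.01, size=(N_STROKES, LATENT_DIM))

def encode(frame):
    """Project a flattened camera frame into the latent space."""
    return np.tanh(W_enc @ frame.ravel())

def generate_symbol(z):
    """Map a latent code to stroke activations; the strongest
    activations pick which strokes compose the new symbol."""
    scores = W_gen @ z
    return np.argsort(scores)[::-1][:5]  # indices of the 5 strongest strokes

frame = rng.random((64, 64))   # one live-stream frame (toy data)
z = encode(frame)
strokes = generate_symbol(z)
```

Each incoming frame thus yields one latent code and one candidate symbol, which is why the projected landscape keeps changing as the camera feed changes.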
The second visualization is an evolving concrete poem, written in natural language, that both makes meaning for these new-born symbols and communicates with audiences in English. Its aesthetic is inspired by twentieth-century concrete poetry, which arranges linguistic elements typographically in space to convey meaning. We used a pre-trained fully convolutional localization network to determine the arrangement of sentences and to generate multiple localized descriptive sentences across regions of the image. The model also outputs a confidence score estimating how well each descriptive sentence corresponds to the ground truth. In the visualization, we purposefully select the lowest-confidence outputs (the most inaccurate results) and the highest-confidence outputs (the most accurate results) and combine them into interactive intelligent poetry that produces metaphors with ambiguity and unexpectedness.
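The lowest/highest-confidence pairing might be sketched as a simple selection over the captioning model's `(sentence, confidence)` pairs. The helper name and data shape are hypothetical; DenseCap's real output includes bounding boxes as well.

```python
def compose_poem_lines(captions):
    """captions: list of (sentence, confidence) pairs.
    Pair the least confident description (ambiguous, metaphorical)
    with the most confident one (accurate) to form one poem couplet."""
    ranked = sorted(captions, key=lambda c: c[1])
    return [ranked[0][0], ranked[-1][0]]

captions = [
    ("a man holding a cat", 0.91),
    ("a window on a wall", 0.42),
    ("a blurry shape of light", 0.07),
]
poem = compose_poem_lines(captions)
# poem pairs the 0.07-confidence line with the 0.91-confidence line
```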
We used the DenseCap model to determine the arrangement of sentences across the regions of the visualization. We render the text of both the inaccurate and the accurate results in a handwriting style with ink effects; the letters’ size, spacing, and curvature are determined by the confidence score evaluated by the DenseCap model. After layers of image-processing techniques and creative programming, the second visualization’s outcome is an evolving collage of real-time video fragments with machine-generated texts in an ink-wash painting style.
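One plausible shape for the confidence-to-typography mapping is sketched below. The specific ranges and the idea that low-confidence text renders smaller, looser, and wavier are assumptions for illustration, not the piece's actual parameters.

```python
def glyph_style(confidence, base_size=24.0):
    """Map a caption's confidence in [0, 1] to hypothetical type
    parameters: confident captions render larger, tighter, straighter."""
    size = base_size * (0.5 + confidence)       # 12 pt .. 36 pt
    spacing = 1.0 + (1.0 - confidence) * 0.8    # looser tracking when unsure
    curvature = (1.0 - confidence) * 0.6        # wavier baseline when unsure
    return {"size": size, "spacing": spacing, "curvature": curvature}

sure = glyph_style(1.0)     # large, tight, straight
unsure = glyph_style(0.0)   # small, loose, wavy
```

A continuous mapping like this lets every caption sit somewhere on a visual spectrum between certainty and ambiguity, rather than splitting text into two fixed styles.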
Visualization I Cangjie writes poetry using pixels of live streaming textures, 2020. (© Weidi Zhang)
Visualization II Cangjie describes surroundings as a visual form of poetry. (© Weidi Zhang)
Introducing the Special Edition 仓 颉 的 诗 特别版:
As an artistic response to the COVID-19 pandemic and social-distancing regulations, instead of using a real-time live stream of the surroundings as the video feed for the Cangjie system, we collected and curated submissions of daily footage from around the world. We edited the submitted footage from Canada, China, the UK, the United States, Japan, Korea, and Egypt. Cangjie interprets the edited footage and generates the visualization based on its interpretation. We rendered the animation generated by Cangjie and designed the sound using granular synthesis.
Compared to the original interactive version, this special edition is a pre-rendered audio-visual experience, so the audience cannot interact with it in real time. However, it travels and extends Cangjie’s vision to different corners of the world, connecting isolated memories with more diverse content than an installation environment allows. Meanwhile, the personal footage of daily life builds a new level of intimacy and evokes feeling during this particular time.
Special thanks to [ original video footage provider ]:
Abdalla Morsi, Adam Menter, Ahmed Sarhan, Lijiaozi Cheng, Danica Sapit, Jing Yan, Jordan Gray, Lisa Kolb, Mert Toka, Mohammad Helmy, Myungin Lee, Nataliia Frank, Quinn Keck, Shiyu Lv, Sijia Li, Kaiping Sui
Cangjie is not only a conceptual response to the tension and fragility in the coexistence of humans and machines but also an artistic expression of a future language that reflects on ancient truths, a way to evoke enchantment in this era of artificial intelligence. The interactivity of this intelligent system prioritizes the ambiguity and tension that exist between the actual and the virtual, machinic vision and human perception, past and present.
Having learned from over 9,000 Chinese characters, Cangjie creates its own symbolic system. It observes the real world, writes poetry using its symbolic system, and explains it to audiences in natural language.
Just like the legendary historian Cangjie did nearly 5,000 years ago.
It continuously writes poetry with humans collaboratively,
as long as the real world exists.