Avatars, Cyborgs and Robots: Can Humans Enhance Themselves? (I)

The design of bionic prostheses and the creation of intelligent machines are some of the advances made towards enhancing our physical and intellectual abilities


Mme. Alberti’s flying prosthesis. Source: Courtesy of the Boston Public Library, Leslie Jones Collection.

The interaction between humans and machines has evolved on the basis of quantifying our environment and our bodies, reducing our reality to discrete data that can be processed mathematically. This procedure has improved our capacity to act on and redesign our environment. The production of highly efficient workplaces, the design of bionic prostheses and the creation of intelligent machines are some of the advances made towards enhancing our physical and intellectual abilities. These advances are interrelated within the technoscientific ecosystem that surrounds us, an ecosystem that is modifying our perception and behaviour even as it redefines our understanding of what it means to be human.

In 1973, the Center for Human Modeling and Simulation at the University of Pennsylvania launched the development of Jack, the first virtual human. Jack is a software package for computer-aided design (CAD) that can recreate a three-dimensional, interactive simulation of a human. This realistic representation is rendered on screen thanks to a set of data gathered from the parametrisation of the human body and its behaviour. By 1996, Jack had soft skin drawn from 6,000 polygons. Underneath it was a reproduction of the human organs, connected by a structure that moved and flexed just like a human’s. Jack also had rudimentary intelligence, which allowed him to interact with his environment, grab or avoid objects and recalculate his position. A database gathered by the US Army in 1988 made it possible to recreate a whole family of Jacks with realistic variations in height, weight, age and gender. In this way, Jack could be scaled to represent different body types, which were programmed to carry out tasks in a designed 3D environment.
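
The underlying idea, reducing the body to a set of parameters from which many variants can be instantiated, can be sketched in a few lines of Python. The field names and segment ratios below are illustrative assumptions, not Jack’s actual data model:

```python
from dataclasses import dataclass

# Hypothetical sketch of anthropometric scaling, in the spirit of Jack:
# a body model is a set of parameters drawn from a population database,
# and every segment of the figure is derived from those parameters.

@dataclass
class Anthropometry:
    stature_cm: float   # overall height
    mass_kg: float      # body weight
    sex: str            # used to select the statistical model

# Illustrative segment proportions as fractions of stature (a real system
# would read these from a survey database such as the 1988 Army survey).
SEGMENT_RATIOS = {"upper_arm": 0.186, "forearm": 0.146, "thigh": 0.245, "shank": 0.246}

def scale_skeleton(body: Anthropometry) -> dict[str, float]:
    """Return segment lengths (cm) for a figure scaled to this body."""
    return {name: ratio * body.stature_cm for name, ratio in SEGMENT_RATIOS.items()}

# A "family of Jacks": the same model instantiated with different parameters.
small = scale_skeleton(Anthropometry(152.0, 50.0, "female"))
tall = scale_skeleton(Anthropometry(190.0, 95.0, "male"))
print(small["thigh"], tall["thigh"])  # 37.24 vs. 46.55
```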

Nowadays, Jack is owned and distributed by Siemens PLM Software. It is marketed as a tool for producing highly efficient, safe work environments adapted to the anthropometric and biomechanical characteristics of workers in the country of implementation. This advanced anthropometric scaling, a result of using specific databases for each country, allows for a highly efficient connection between workers and the machines they operate, resulting in the automation not only of production but also of the workers’ actions.

The development of computer graphics started in 1962, when Ivan Sutherland created the first Sketchpad system at MIT. Sketchpad allowed users to draw on the screen with a light pen; the lines were read by the computer as vectors (data structures recording the light pen’s control points, screen coordinates and direction) and then codified as the mathematical expression of a primitive geometric figure such as a line, a curve or a polygon. These geometric shapes were treated as objects that could be modified in real time using algorithms that scaled, assembled or instantiated them. Computer graphics evolved with the use of more complex data structures (capable of codifying three-dimensional objects) and algorithms that made it possible to reproduce natural phenomena, giving rise to a realistic and programmable simulation of the physical world. With the implementation of optical and mechanical sensors able to track the position, movement and shape of the human body and translate them into data, the first virtual reality systems appeared. Among them was Myron Krueger’s Videoplace (1975), which used a visual recognition system to translate the user’s silhouette and movements into a simulated space where they could interact with the graphic objects around them. Before that, in 1968, Ivan Sutherland had demonstrated a virtual reality headset that tracked the position of the user’s head in order to adjust the three-dimensional scene to the user’s field of vision. All these systems allowed humans to interact with a virtual environment recreated by a machine.
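
The core insight, that a drawing is stored not as pixels but as editable mathematical objects, can be illustrated with a toy example. The class below is a loose sketch of the idea, not Sketchpad’s actual data structure:

```python
# A toy version of the vector-graphics idea behind Sketchpad: a figure is a
# list of control points, and editing means applying mathematics to them.

class Polyline:
    def __init__(self, points: list[tuple[float, float]]):
        self.points = points  # control points in screen coordinates

    def scale(self, factor: float) -> None:
        """Scale the figure about the origin by modifying its control points."""
        self.points = [(x * factor, y * factor) for x, y in self.points]

    def translate(self, dx: float, dy: float) -> None:
        """Move the whole figure; the shape itself is unchanged."""
        self.points = [(x + dx, y + dy) for x, y in self.points]

# A triangle is three control points; scaling it is arithmetic, not redrawing.
triangle = Polyline([(0.0, 0.0), (10.0, 0.0), (5.0, 8.0)])
triangle.scale(2.0)
triangle.translate(100.0, 50.0)
print(triangle.points)  # [(100.0, 50.0), (120.0, 50.0), (110.0, 66.0)]
```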

While Jack was evolving on the workstations of industrial laboratories, the personal computer revolution took place. Computers entered the homes of a new, global community of users who were connected to the same network and whose capacity to process and store data was constantly improving. This became the gateway to a new, open communication space intended to augment the intellectual capacity of its users. In a way, the networked personal computer served as an external memory where data, digitalised and indexed in multimedia documents, was collectively produced, globally distributed and constantly re-evaluated. Thanks to the graphical user interface, Jack could be modified visually in real time. In this environment, Jack was adapted to be stored on the Internet, where he would be accessible from any desktop computer. There, Jack would become an extension of his cyberspace users, shifting from a standardised representation of a human to an avatar. Jack became a stand-in for a human being: he could be controlled with the mouse and take on social features such as physical appearance, clothing and gestures. All of this turned Jack into an ever-changing identity that evolved in dialogue with its user, as well as through encounters with other avatars across the virtual worlds that emerged in the new space opened up by communication networks. This stand-in translated the complexity of a body into binary code that could be modified and represented on screen, making users feel immersed in new spaces of hypertext games rendered in real time and altered by their actions. Virtual worlds arose in the form of computer games and online communities, such as Second Life, where users could be whatever and whoever they wanted, in worlds governed by hackable code. Inhabitable dystopias were designed within the infinite possibilities of the virtual world and fed our desire to be connected, a desire that found its fullest expression in the new neuromancers of cyberspace.

Cyberpunks replaced their nihilist ancestors’ slogan “There’s No Future” with “The Future Is Now”. It was a fractal, mutant future, full of possibilities and projected from a new, free and autonomous space. The Internet was conceived as a decentralised place where information wanted to be free and where navigating the edges meant seeking out the other, understood as a source of innovation. In this quest for an intimate connection within virtual boundaries, the most daring representatives of the movement raised the possibility of codifying brain activity into data that could be interpreted by a computer. Among the promoters of this brain-to-hardware connection, known as wetware, was the psycho-engineer Masahiro Kahata, who developed the Interactive Brainwave Visual Analyzer (IBVA). With the help of an electrode strapped to the user’s forehead, the device captures the electrical waves produced by brain activity and translates them into colourful 3D graphics on the computer screen. Although IBVA still lacks the means to transfer the user’s mind into cyberspace, it points to new interactive possibilities between humans and machines; namely, the possibility of an interface directly connected to the nervous system. Around the same time, the company BioControl Systems developed the BioMuse, aimed at codifying these wave variations into data that can be programmed to send specific commands to the computer. This technology allowed the user to control the computer mentally, an advance that led the AquaThought Foundation to develop MindSet, a device that uses these wave intensities to map the brain and its functions.
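
The principle behind devices of this kind, turning the intensity of a brain-wave band into a machine command, reduces to a simple signal-processing loop. The sketch below is a deliberately simplified illustration with made-up thresholds and commands, not the protocol of any real device:

```python
import numpy as np

# Simplified illustration of the brainwave-to-command idea: estimate the
# power of the alpha band (8-12 Hz) in one EEG window and map it to a
# command. Sampling rate, threshold and commands are illustrative assumptions.

SAMPLE_RATE = 256  # samples per second

def alpha_power(window: np.ndarray) -> float:
    """Estimate alpha-band power of one EEG window via the FFT."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / SAMPLE_RATE)
    band = (freqs >= 8.0) & (freqs <= 12.0)
    return float(spectrum[band].sum())

def to_command(power: float, threshold: float = 10000.0) -> str:
    """Map band power to one of two commands (focused vs. idle)."""
    return "SELECT" if power > threshold else "IDLE"

# One second of fake signal: a 10 Hz oscillation buried in noise.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
eeg = 5.0 * np.sin(2 * np.pi * 10 * t) + np.random.randn(SAMPLE_RATE)
print(to_command(alpha_power(eeg)))  # "SELECT"
```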

In other fields, using sensors to gather data from the body’s impulses, movements and shapes has given rise to bionics, which seeks to overcome certain disabilities and enhance human action by connecting artificial systems to the body. The concept of the cyborg, or cybernetic organism, was coined by Manfred Clynes and Nathan Kline in their 1960 article “Cyborgs and Space”, where they suggested adding prostheses and mechanical alterations to the body in order to adapt it to diverse environments, such as outer space. Since ancient times, the development of mechanical prostheses has made it possible to help people for whom the everyday environment represents a challenge. What characterises modern bionics is the use of computational systems able to process information, thus allowing communication between body and machine and resulting in integrated functioning. By the mid-1990s, over 7,000 people had recovered part of their hearing thanks to cochlear implants, devices that translate sounds into bioelectrical impulses which are then transmitted to the inner ear through electrodes. More recently, the interpretation of our nervous system’s impulses has made it possible to develop sensitive prostheses capable of responding to our wishes and even taking into account aspects of the environment in order to adjust their efficiency. An example of this are the bionic legs developed by the Biomechatronics group at MIT under the direction of Hugh Herr. These legs, created through complex mathematical modelling of the wearer’s body and manufactured from intelligent materials, are able to change their stiffness and flexibility in response to electrical impulses. They are connected to the spinal cord through a sophisticated system of circuits and software that can interpret nervous impulses to control complex tasks such as running, jumping and even dancing or climbing. In a related field, the neuroprosthetics lab Bensmaialab produces mechanical limbs equipped with sensors that can provide naturalistic sensations by stimulating the wearer’s cortical or peripheral neurons, reproducing as faithfully as possible the patterns of neuronal activation involved in basic object handling.
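
The basic strategy of a cochlear implant, decomposing sound into frequency bands and routing each band’s energy to an electrode along the cochlea, can be roughly sketched as follows. The number of channels and the band edges below are illustrative assumptions, not the specification of any real implant:

```python
import numpy as np

# Rough sketch of the cochlear-implant principle: split incoming sound into
# frequency bands and use each band's energy to drive one electrode.

SAMPLE_RATE = 16000
BAND_EDGES_HZ = [250, 500, 1000, 2000, 4000, 8000]  # 5 illustrative channels

def electrode_levels(frame: np.ndarray) -> list[float]:
    """Return one stimulation level per electrode for a short audio frame."""
    spectrum = np.abs(np.fft.rfft(frame)) ** 2
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / SAMPLE_RATE)
    levels = []
    for lo, hi in zip(BAND_EDGES_HZ[:-1], BAND_EDGES_HZ[1:]):
        band = (freqs >= lo) & (freqs < hi)
        levels.append(float(spectrum[band].sum()))  # band energy -> pulse amplitude
    return levels

# A 440 Hz tone concentrates its energy on the 250-500 Hz electrode.
t = np.arange(512) / SAMPLE_RATE
print(electrode_levels(np.sin(2 * np.pi * 440 * t)))
```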

These prostheses have widened their field of application so much that a paradigm shift has occurred: from overcoming disability to enhancing human abilities. A company like Ekso Bionics, for instance, not only produces exoskeletons that allow tetraplegic people to stand up and walk (with the help of a microcircuit system that interprets and reproduces body movements), but also focuses its most lucrative projects on military research. One of those projects is Warrior Web, financed by DARPA (the US Defense Advanced Research Projects Agency), which aims to create a lightweight, low-consumption exoskeleton. Worn underneath the uniform and controlled by a computer integrated into the camouflage backpack, this exoskeleton gathers data on the soldier’s movement and applies the hydraulic power needed for the soldier to walk or run faster with no additional effort. Such enhancement of the human body does not only work by creating systems that improve the efficiency of our limbs; in some cases, it crosses the interspecies barrier, using data gathered from other organisms to model prostheses that yield new and improved abilities. The cheetah legs of the athlete Aimee Mullins are one example of this technology.
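
In control terms, such an exoskeleton is a feedback loop: sense the wearer’s motion, estimate the torque a joint is producing, and add a fraction of it. The proportional-assist loop below is a generic textbook sketch with made-up gains and limits, not Warrior Web’s actual controller:

```python
# Generic sketch of a proportional assist loop, of the kind an exoskeleton
# controller might run: measure the torque the wearer's joint exerts and
# command the actuator to supply a fixed fraction of it. The gain and the
# safety limit are illustrative assumptions.

ASSIST_GAIN = 0.3      # actuator supplies 30% of the estimated human torque
MAX_TORQUE_NM = 40.0   # hardware safety limit

def assist_torque(human_torque_nm: float) -> float:
    """Torque (Nm) the actuator should add, clamped to the safe range."""
    commanded = ASSIST_GAIN * human_torque_nm
    return max(-MAX_TORQUE_NM, min(MAX_TORQUE_NM, commanded))

# One step of the loop: a sensor reading of 50 Nm at the knee yields a
# 15 Nm assist, so the wearer does proportionally less work.
print(assist_torque(50.0))   # 15.0
print(assist_torque(200.0))  # clamped to 40.0
```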

The appearance of cyborgs has made the human body a hackable item. The body is now an experimental area where data from different systems can be turned into a common code and then recombined, modelled and transferred from one system to another, opening up a path towards hybridisation and new potentialities. There is now room for creative intervention and design, not only by connecting electromechanical systems to the body, but also by delving into the very components of our body, especially since the appearance of bioprinters in 2003. Bioprinters are 3D printers capable of printing organic tissue from living cells, turning living matter into a material that can be modelled with computer design programmes. They allow the creation of organs and replacement tissue that fit the patient receiving them, and they can even integrate electronic components to produce functional organs. The ear recently developed at Princeton and Johns Hopkins University combines a hydrogel structure, living cells that form the cartilage, and silver nanoparticles that make up an antenna for receiving sound.

These technoscientific interventions in our lives have led us to the Posthuman Condition. Humans have ceased to be a natural, isolated entity and have become part of what Donna Haraway calls the nature-culture continuum, a relational understanding of our environment in which the distinction between natural and artificial no longer applies. Instead, humans are understood as a relational node, defined in relation to the systems that surround them and with which they interact on a daily basis. Cyborgs are produced not only by the prosthetic integration of artificial body parts, but also by the connection of workers to factory production lines (which determine the rhythm and scope of their movements), by the immersion of the body on a dance floor (where we are guided by the spectacle of light and electronic music), and by surfing the Internet, an act mediated by interfaces and virtual agents. These environments, in which the body is immersed through complex hybridisation, determine the possibilities for action and perception, and are designed according to a standardisation of the body and its functions. The environment is thus tailor-made for us: it responds to our needs according to criteria of usefulness and efficiency, enhancing our abilities even as it redefines us as human beings.


Post related to the CCCB exhibition HUMAN+: The future of our species

All rights of this article reserved by the author
