He played an essential part in the early development of BSD UNIX while a graduate student at Berkeley, and he is the original author of the vi text editor. He argued that intelligent robots would replace humanity, at least in intellectual and social dominance, in the relatively near future. He advocated the relinquishment of GNR (genetics, nanotechnology, and robotics) technologies, rather than entering an arms race between harmful uses of the technology and defenses against those uses (for example, "good" nano-machines patrolling and protecting against grey-goo "bad" nano-machines). This position of broad relinquishment was criticized by technologists including technological-singularity thinker Ray Kurzweil, who instead advocated fine-grained relinquishment and ethical guidelines. Joy was likewise criticized by the conservative American Spectator, which characterized his essay as a (perhaps unwitting) justification for statism.

A barroom discussion of these technologies with Ray Kurzweil set Joy's thinking along this path. He states in his essay that during the conversation he was surprised that other serious scientists considered such possibilities likely, and even more astonished at what he felt was a lack of consideration of the contingencies. After raising the issue with a few more acquaintances, he says he was further alarmed that although many people considered these futures possible or probable, very few of them seemed to share as serious a concern about the dangers as he did. This concern led to his in-depth examination of the issue and of the positions of others in the scientific community on it, and eventually to his current activities regarding it.

Despite this, he is a venture capitalist, investing in GNR technology companies. He has also raised a specialty venture fund to address the dangers of pandemic diseases, including H5N1 avian influenza and biological weapons.