Human microchipping is a technology that involves inserting a small, grain-of-rice-sized chip under a person's skin. Modern implants can serve several functions, such as authorizing credit card transactions and tracking the user's health and location.

A human microchip implant is typically an identifying integrated circuit device or RFID transponder encased in silicate glass and implanted in the body of a human being. This type of subdermal implant usually contains a unique ID number that can be linked to information in an external database, such as personal identification, medical history, medications, allergies, and contact information, and can also be referenced by law enforcement.
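The lookup model described above can be sketched in a few lines of Python: the implant itself stores only a unique ID, and all personal data is resolved against an external database. The field names and example record here are illustrative assumptions, not any real registry's schema:

```python
# Minimal sketch of the RFID lookup model: the chip stores only an ID;
# everything else lives in an external database.
# All names and records below are illustrative, not a real registry schema.

RECORDS = {
    "04-A3-7F-19": {
        "name": "Jane Doe",
        "allergies": ["penicillin"],
        "emergency_contact": "+1-555-0100",
    },
}

def lookup(chip_id: str) -> dict:
    """Resolve a scanned chip ID to its database record."""
    record = RECORDS.get(chip_id)
    if record is None:
        raise KeyError(f"unknown chip ID: {chip_id}")
    return record
```

A scanner that reads `"04-A3-7F-19"` would retrieve Jane Doe's record; an unknown ID raises an error rather than returning partial data.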

A microchip, smaller than a fingernail, contains computer circuitry called an integrated circuit. The invention of the integrated circuit stands as one of the most important innovations in modern history; nearly all modern electronic products rely on chip technology.

The pioneers known for inventing microchip technology are Jack Kilby and Robert Noyce. In 1959, Kilby of Texas Instruments filed a U.S. patent application for miniaturized electronic circuits, and Noyce of Fairchild Semiconductor Corporation filed a patent application for a silicon-based integrated circuit.

The first experiments with a radio-frequency identification (RFID) implant were carried out in 1998 by the British scientist Kevin Warwick. His implant was used to open doors, switch on lights, and trigger verbal output within a building. After nine days the implant was removed; it has since been held in the Science Museum in London.
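The behavior described above is, at its core, a dispatch from a scanned tag to a registered action. A heavily simplified sketch, where the tag IDs, zones, and actions are all illustrative assumptions rather than details of Warwick's actual system:

```python
# Toy sketch of tag-triggered building automation: a reader in some zone
# reports a tag ID, and the action registered for that zone fires.
# IDs, zones, and actions are illustrative assumptions only.

ACTIONS = {
    "door": lambda: "door unlocked",
    "lights": lambda: "lights switched on",
    "speaker": lambda: "verbal greeting played",
}

AUTHORIZED_TAGS = {"TAG-001"}

def on_tag_read(tag_id: str, zone: str) -> str:
    """Fire the action for the reader's zone if the tag is authorized."""
    if tag_id not in AUTHORIZED_TAGS:
        return "ignored"
    return ACTIONS[zone]()
```

Walking past the door reader with an authorized tag yields `"door unlocked"`, while an unknown tag is simply ignored.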

Microchips are built layer by layer on a wafer of semiconductor material, such as silicon. The layers are patterned by a process called photolithography, which uses chemicals, gases, and light. Conducting paths between the components are created by overlaying the chip with a thin layer of metal, usually aluminum. Photolithography and etching are then used to remove the excess metal, leaving only the conducting pathways.
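The metallization-and-etch step can be illustrated with a toy model: deposit a blanket metal layer over the whole wafer, then let a photolithographic mask protect only the intended pathways while everything else is etched away. This is purely an illustration of the logic, not a fabrication simulation:

```python
# Toy model of the metallization step: deposit a uniform metal layer ("M"),
# then etch away every cell the mask does not protect.
# '#' in the mask marks a protected conducting pathway; '.' is exposed.

def metallize_and_etch(mask: list[str]) -> list[str]:
    """Return the wafer surface after blanket deposition and etching."""
    deposited = [["M"] * len(row) for row in mask]   # blanket metal layer
    for r, row in enumerate(mask):
        for c, cell in enumerate(row):
            if cell != "#":                          # unprotected -> etched
                deposited[r][c] = "."
    return ["".join(row) for row in deposited]

mask = [
    "##....##",
    ".######.",
    "##....##",
]
```

Applied to the mask above, metal survives only where the mask protected it, leaving an "M"-shaped pattern of conducting paths on an otherwise bare surface.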

Today, microchips let smartphones access the internet and are also used in televisions, GPS devices, and medicine. While human microchipping is only starting to take hold, people have been microchipping pets for many years. Since pet microchipping began in 1996, over four million pets have been microchipped, and with only 391 reported incidents of faulty chips, people decided to take a chance on the technology for humans.