So the quest is cognitive computing, which is about engineering aspects of mind such as sensation, perception, cognition, emotion, action, and interaction by reverse-engineering the brain, and then deploying this technology by connecting it to a vast array of sensors, billions or even trillions of them: sensors for sight, hearing, taste, touch, and smell, but also non-biological sensors, for example sensors monitoring the forests, the oceans, people, animals, organizations, homes, and cars. This vast amount of data would be streamed in real time or near real time to a global brain that can extract large-scale invariant patterns from the sensory overload and act and respond to this data. It should be noted that remote sensing can be used for monitoring purposes in addition to in situ sensors such as telemetry implants.
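The pipeline described above (many heterogeneous sensor streams feeding an aggregator that extracts a shared, large-scale pattern) can be sketched in miniature. This is purely an illustrative toy, assuming simulated Gaussian sensor noise and a correlated spike as the stand-in for an "invariant pattern"; all names (`sensor_stream`, `detect_invariant`) are hypothetical.

```python
# Toy sketch: heterogeneous sensor streams feed one aggregator that
# looks for a pattern shared across all streams. Illustrative only.
import random
import statistics

def sensor_stream(kind, n, spike_at=None):
    """Yield n noisy readings; optionally inject a spike at one time step."""
    for t in range(n):
        value = random.gauss(0.0, 1.0)
        if spike_at is not None and t == spike_at:
            value += 10.0  # a shared event visible across streams
        yield (kind, t, value)

def detect_invariant(streams, threshold=5.0):
    """Return time steps where the streams spike together: a toy
    stand-in for 'large-scale invariant pattern' extraction."""
    by_time = {}
    for stream in streams:
        for kind, t, value in stream:
            by_time.setdefault(t, []).append(value)
    return [t for t, vals in sorted(by_time.items())
            if statistics.mean(vals) > threshold]

random.seed(0)
streams = [sensor_stream(k, 20, spike_at=7)
           for k in ("sight", "hearing", "touch", "forest", "ocean")]
print(detect_invariant(streams))  # the shared spike at t=7 stands out
```

The point of the sketch is the shape of the architecture, not the detector: individual readings are noisy and uninformative, but aggregating many streams makes the shared event trivially separable from the background.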
It may be that in a future society nanobots will be used for high-resolution in situ brain monitoring and communication through something like a 'global brain'. However, I tend to believe that implants are unnecessary for neural monitoring, since relying on them would mean that every synchronistic human event I have experienced was due to a select portion of people having nanorobotic implants, which is unlikely if the implants are expensive and difficult to install on a wide scale. Neural implants themselves have been in use since the 1950s, from rudimentary devices such as José Delgado's stimoceiver to modern electrode chips such as BrainGate.