sensenet


Sensorimotor and tactile simulator for deep learning

Project status: Published/In Market

Virtual Reality, Robotics, Artificial Intelligence

Intel Technologies
Intel Opt ML/DL Framework

Code Samples [1]

Overview / Usage

The majority of artificial intelligence research, as it relates to biological senses, has focused on vision. The recent explosion of machine learning, and in particular deep learning, can be partially attributed to the release of high-quality datasets from which algorithms can model the world. Most of these datasets, however, consist of images. We believe that focusing on sensorimotor systems and tactile feedback will create algorithms that better mimic human intelligence. Here we present SenseNet: a collection of tactile simulations and a large-scale dataset of 3D objects for manipulation. SenseNet was created for the purpose of researching and training artificial intelligences (AIs) to interact with the environment via sensorimotor neural systems and tactile feedback. We aim to spark the same kind of explosion seen in image processing, but in the domain of tactile feedback and sensorimotor research. We hope that SenseNet can offer researchers in both the machine learning and computational neuroscience communities brand new opportunities and avenues to explore.

Methodology / Approach

SenseNet is a sensorimotor and touch simulator for teaching AIs to interact with their environments via sensorimotor systems and touch neurons. It is meant as a research framework for machine learning researchers and theoretical computational neuroscientists, and can be used in reinforcement learning environments. The original code uses OpenAI's gym as its base, so any code written for gym can be used with little to no modification; often you can simply replace gym with sensenet and everything will work.
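The gym compatibility described above rests on the environment exposing the standard reset/step interface. A minimal sketch of that interaction loop, using a toy stand-in environment rather than SenseNet's actual classes (the environment and its touch observations here are illustrative assumptions; consult the repository for the real API):

```python
import random

class StubTouchEnv:
    """Toy stand-in for a SenseNet environment, exposing the
    OpenAI gym-style reset()/step() interface the text describes."""

    def reset(self):
        # Return an initial observation (here: a dummy touch reading).
        self.t = 0
        return [0.0]

    def step(self, action):
        # Advance one timestep and return the gym-style 4-tuple:
        # (observation, reward, done, info).
        self.t += 1
        observation = [random.random()]   # simulated touch-sensor value
        reward = 1.0 if action == 1 else 0.0
        done = self.t >= 10               # fixed-length toy episode
        info = {}
        return observation, reward, done, info

def run_episode(env):
    """Generic gym-style rollout loop: works unchanged with any
    environment that implements reset() and step()."""
    obs = env.reset()
    total_reward, done = 0.0, False
    while not done:
        action = 1  # trivial fixed policy, for illustration only
        obs, reward, done, info = env.step(action)
        total_reward += reward
    return total_reward

# With the real library one would construct a SenseNet environment
# instead of the stub; the rollout loop itself would not change.
print(run_episode(StubTouchEnv()))
```

Because the loop depends only on the reset/step contract, swapping the stub for a real SenseNet environment is the "little to no modification" case the text refers to.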

Technologies Used

TensorFlow
PyBullet
Python
Coach

Repository

http://github.com/jtoy/sensenet
