XRDS: Crossroads, The ACM Magazine for Students

Association for Computing Machinery

Articles Tagged: Neural networks

Articles & Features

COLUMN: Letter from the editors

Finding the edge: Art and automation

By Jennifer Jacobs, April 2018

PDF | HTML | In the Digital Library

COLUMN: INIT

Computers and art in the age of machine learning

By Emily L. Spratt, April 2018

PDF | HTML | In the Digital Library

SECTION: Features

The Replica Project: Building a visual search engine for art historians

From prehistoric etchings on the walls of the Lascaux cave to the present day, people have always created art. With millions of artistic artifacts filling museums, churches, cultural institutions, and private collections across the globe, connecting with our shared cultural and artistic past is no longer out of reach.

By Benoit Seguin, April 2018

PDF | HTML | In the Digital Library

The burgeoning computer-art symbiosis

Computers help us understand art. Art helps us teach computers.

By Shiry Ginosar, Xi Shen, Karan Dwivedi, Elizabeth Honig, Mathieu Aubry, April 2018

PDF | HTML | In the Digital Library

Creation, curation, and classification: Mario Klingemann and Emily L. Spratt in conversation

Computer-generated art has long challenged traditional notions of the roles of the artist and the curator in the creative process. In the age of machine learning, these philosophical conceptions require even further consideration.

By Emily L. Spratt, April 2018

PDF | HTML | In the Digital Library

DEPARTMENT: Updates

"ANN" helps Mario rescue Princess Toadstool

By Daniel López Sánchez, September 2016

PDF | HTML | In the Digital Library

SECTION: Features

Profile: Geoffrey Hinton
Unlocking the language of the brain

By Adrian Scoică, October 2014

PDF | HTML | In the Digital Library

Storage capacity comparison of neural network models for memory recall

The physiology of how the human brain recalls memories is not well understood. Neural networks have been used in an attempt to model this process.

Two types of networks, auto- and hetero-associative, have been used in several models of temporal sequence memory, applied to simple sequences of both randomly generated and structured patterns. Previous work has shown that a model with coupled auto- and hetero-associative continuous attractor networks can robustly recall learned simple sequences. In this paper, we compare Hebbian learning and pseudo-inverse learning in a model for recalling temporal sequences, in terms of their storage capacities. The pseudo-inverse learning method is shown to have a much higher storage capacity, making the new network model 700 percent more efficient by reducing the number of calculations required.
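The contrast the abstract draws between Hebbian (outer-product) and pseudo-inverse weight construction can be sketched in a few lines. The following is a minimal illustration with a one-layer associative network on static bipolar patterns, not the paper's coupled continuous attractor model; the network size, load, and one-step recall rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 64, 20                                # neurons, stored patterns
X = rng.choice([-1.0, 1.0], size=(P, N))     # random bipolar patterns (one per row)

# Hebbian (outer-product) auto-associative weights
W_hebb = (X.T @ X) / N
np.fill_diagonal(W_hebb, 0.0)                # no self-connections

# Pseudo-inverse weights: W = X^+ X projects onto the pattern subspace,
# so every stored pattern is a fixed point (exact recall while P < N)
W_pinv = np.linalg.pinv(X) @ X

def recall(W, probe):
    """One-step synchronous update: sign of the weighted input."""
    return np.sign(W @ probe)

# Fraction of bits recalled correctly from each rule at load P/N ~ 0.31,
# which is above the classical Hebbian capacity of roughly 0.14 N
acc_hebb = np.mean(recall(W_hebb, X.T) == X.T)
acc_pinv = np.mean(recall(W_pinv, X.T) == X.T)
print(f"Hebbian: {acc_hebb:.3f}  pseudo-inverse: {acc_pinv:.3f}")

# Hetero-associative variant for sequence recall: a least-squares map
# sending each pattern to its successor, so x_mu (as a row) maps to x_{mu+1}
W_seq = np.linalg.pinv(X) @ np.roll(X, -1, axis=0)
assert np.array_equal(np.sign(X @ W_seq), np.roll(X, -1, axis=0))
```

At this load the pseudo-inverse rule recalls every pattern perfectly while the Hebbian rule does not, which illustrates the capacity gap the abstract measures; the paper itself evaluates this on temporal sequences rather than static patterns.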

By Kate Patterson, December 2007

PDF | HTML | In the Digital Library

Using perception in managing unstructured documents

By Ching Kang Cheng, Xiaoshan Pan, December 2003

PDF | HTML | In the Digital Library