How To Trick Deep Learning Algorithms Into Doing New Things
TechTalks, July 20th, 2020
August 2, 2020
Volume 268, Issue 5

Two things often mentioned alongside deep learning are 'data' and 'compute resources': you need a lot of both to develop, train, and test deep learning models.

"When developers don't have a lot of training samples or access to very powerful servers, they use transfer learning to finetune a pre-trained deep learning model for a new task." writes Ben Dickson in TechTalks.

"At this year's ICML conference, scientists at IBM Research and Taiwan's National Tsing Hua University Research introduced 'black-box adversarial reprogramming' (BAR), an alternative repurposing technique that turns a supposed weakness of deep neural networks into a strength..."

Read More ...
