
Composing Fisher Kernels from Deep Neural Models

A Practitioner's Approach

Composing Fisher Kernels from Deep Neural Models by Tayyaba Azim
Save $56.00
$95.99 was $151.99
or 4 payments of $24.00
Releases

Pre-order to reserve stock from our first shipment. Your credit card will not be charged until your order is ready to ship.

Available for pre-order now

Availability

This product will be released on September 25th, 2018.

It should arrive:

  • 27 Sep to 2 Oct using standard courier delivery

Description

This book shows machine learning enthusiasts and practitioners how to get the best of both worlds by deriving Fisher kernels from deep learning models. It also shares insight on how to store and retrieve large-dimensional Fisher vectors using feature selection and compression techniques. Feature selection and feature compression are two of the most popular off-the-shelf methods for reducing the memory footprint of high-dimensional data, making it suitable for large-scale visual retrieval and classification.

Kernel methods long remained the de facto standard for solving large-scale object classification tasks using low-level features, until the revival of deep models in 2006. Kernel methods made a comeback with improved Fisher vectors in 2010, but their supremacy has since been challenged by successive generations of deep models, which are now considered the state of the art for many machine learning and computer vision tasks. Although the two research paradigms differ significantly, the strong performance of Fisher kernels on the ImageNet large-scale object classification dataset has caught the attention of many kernel practitioners, and parallels have been drawn between the two frameworks to improve empirical performance on benchmark classification tasks.

Exploring concrete examples on different datasets, the book compares the computational and statistical aspects of different dimensionality reduction approaches and identifies metrics to show which approach is superior for Fisher vector encodings. It also provides references to some of the most useful resources that can give practitioners and machine learning enthusiasts a quick start on learning and implementing a variety of deep learning models and kernel functions.
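As context for the Fisher vector encoding the description discusses, here is a minimal sketch of the classic GMM-based Fisher vector (gradients with respect to the means and variances of a diagonal-covariance Gaussian mixture, followed by power and L2 normalisation). The function name and the choice of NumPy are our own for illustration; this is not code from the book, and it omits the deep-model-derived variants the book covers.

```python
import numpy as np

def fisher_vector(X, means, covs, priors):
    """Encode local descriptors X (N x D) as a Fisher vector under a
    diagonal-covariance GMM with K components (means: K x D, covs: K x D
    variances, priors: K mixture weights). Returns a 2*K*D vector."""
    N, D = X.shape
    K = means.shape[0]

    # Posterior (soft assignment) of each descriptor to each component,
    # computed in the log domain for numerical stability.
    log_lik = np.empty((N, K))
    for k in range(K):
        diff = X - means[k]                                   # N x D
        log_lik[:, k] = (np.log(priors[k])
                         - 0.5 * np.sum(np.log(2 * np.pi * covs[k]))
                         - 0.5 * np.sum(diff ** 2 / covs[k], axis=1))
    log_lik -= log_lik.max(axis=1, keepdims=True)
    gamma = np.exp(log_lik)
    gamma /= gamma.sum(axis=1, keepdims=True)                 # N x K

    # Gradient statistics w.r.t. the GMM means and variances.
    parts = []
    for k in range(K):
        diff = (X - means[k]) / np.sqrt(covs[k])              # whitened
        g_mu = (gamma[:, k:k + 1] * diff).sum(axis=0)
        g_sig = (gamma[:, k:k + 1] * (diff ** 2 - 1)).sum(axis=0)
        parts.append(g_mu / (N * np.sqrt(priors[k])))
        parts.append(g_sig / (N * np.sqrt(2 * priors[k])))
    fv = np.concatenate(parts)                                # 2 * K * D

    # Power and L2 normalisation, the "improved" Fisher vector steps.
    fv = np.sign(fv) * np.sqrt(np.abs(fv))
    return fv / (np.linalg.norm(fv) + 1e-12)
```

Note the 2*K*D output dimension: even a modest mixture over deep features yields very high-dimensional encodings, which is why the feature selection and compression techniques the book examines matter in practice.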

Author Biography

Dr. Tayyaba Azim is an Assistant Professor at the Center for Information Technology, Institute of Management Sciences, Peshawar, Pakistan.

Sarah Ahmed is a research student enrolled in the Master of Computer Science program at the Institute of Management Sciences, Peshawar, Pakistan. She received her Bachelor's degree in Computer Science from Edwardes College, Peshawar, Pakistan. Her areas of interest include machine learning, computer vision, and data science. Her current research centres on feature compression and selection approaches for Fisher vectors derived from deep neural models. Her research paper "Compression techniques for Deep Fisher Vectors" was awarded best paper in the applications area at the ICPRAM 2017 conference.
Release date Australia
September 25th, 2018
Pages
59
Edition
1st ed. 2018
Illustrations
XIII, 59 pages; 6 illustrations (5 in color, 1 black and white)
Country of Publication
Switzerland
Imprint
Springer International Publishing AG
ISBN-13
9783319985237
Product ID
28247131
