A Matter of Time: Faster Percolator Analysis via Efficient SVM Learning for Large-Scale Proteomics

John T. Halloran, David M. Rocke

Research output: Contribution to journal › Article


Abstract

Percolator is an important tool for greatly improving the results of a database search and subsequent downstream analysis. Using support vector machines (SVMs), Percolator recalibrates peptide-spectrum matches based on the learned decision boundary between targets and decoys. To improve analysis time for large-scale data sets, we update Percolator's SVM learning engine through software and algorithmic optimizations rather than heuristic approaches, which would require careful study of their impact on learned parameters across different search settings and data sets. We show that by optimizing Percolator's original learning algorithm, l2-SVM-MFN, large-scale SVM learning requires only about a third of the original runtime. Furthermore, we show that by employing the widely used Trust Region Newton (TRON) algorithm instead of l2-SVM-MFN, large-scale Percolator SVM learning is reduced to only about a fifth of the original runtime. Importantly, these speedups affect only the speed at which Percolator converges to a global solution and do not alter recalibration performance. The upgraded versions of both l2-SVM-MFN and TRON are optimized within the Percolator codebase for multithreaded and single-thread use and are available under the Apache license at bitbucket.org/jthalloran/percolator-upgrade.
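To make the idea concrete, the following is a minimal illustrative sketch of the objective both solvers optimize: a linear L2-SVM (squared hinge loss) separating target from decoy peptide-spectrum matches, whose decision values then serve as recalibrated scores. This is not Percolator's implementation, nor l2-SVM-MFN or TRON themselves; it uses plain batch gradient descent on synthetic features, and all names and data here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 1000, 4
# Synthetic PSM features: targets (label +1) score higher on average
# than decoys (label -1); real Percolator features are search-engine scores.
X = np.vstack([rng.normal(0.5, 1.0, size=(n, d)),    # target PSMs
               rng.normal(-0.5, 1.0, size=(n, d))])  # decoy PSMs
y = np.concatenate([np.ones(n), -np.ones(n)])

def train_l2svm(X, y, C=1.0, lr=0.01, epochs=200):
    """Batch gradient descent on the primal L2-SVM objective:
    0.5 * ||w||^2 + C * sum_i max(0, 1 - y_i * (w . x_i))^2."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        margins = 1.0 - y * (X @ w)
        active = margins > 0                 # examples violating the margin
        grad = w - 2.0 * C * (X[active].T @ (y[active] * margins[active]))
        w -= lr * grad / len(y)              # averaged step for stability
    return w

w = train_l2svm(X, y)
scores = X @ w                               # recalibrated PSM scores
acc = float(np.mean(np.sign(scores) == y))
```

Both l2-SVM-MFN and TRON minimize this same convex objective far more efficiently (via modified finite Newton steps and trust-region Newton steps, respectively), which is why swapping solvers changes runtime but not the learned decision boundary.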

Original language: English (US)
Pages (from-to): 1978-1982
Number of pages: 5
Journal: Journal of Proteome Research
Volume: 17
Issue number: 5
DOIs
State: Published - May 4, 2018

Keywords

  • machine learning
  • percolator
  • support vector machine
  • tandem mass spectrometry
  • TRON

ASJC Scopus subject areas

  • Biochemistry
  • Chemistry (all)

