Proteomics: Capacity versus utility

Jenny L. Harry*, Marc R. Wilkins, Ben R. Herbert, Nicolle H. Packer, Andrew A. Gooley, Keith L. Williams

*Corresponding author for this work

Research output: Contribution to journal › Review article › peer-review

166 Citations (Scopus)


Until recently, scientists studied genes or proteins one at a time. With improvements in technology, new tools have become available to study the complex interactions that occur in biological systems. Global studies are required to do this, and these will involve genomic and proteomic approaches. High-throughput methods are necessary in each case because the number of genes and proteins in even the simplest of organisms is immense. In the developmental phase of genomics, the emphasis was on the generation and assembly of large amounts of nucleic acid sequence data. Proteomics is currently in a phase of technological development and establishment, and demonstrating the capacity for high throughput is a major challenge. However, funding bodies (in both the public and private sectors) are increasingly focused on the usefulness of this capacity. Here we review the current state of proteome research in terms of capacity and utility.

Original language: English
Pages (from-to): 1071-1081
Number of pages: 11
Issue number: 6
Publication status: Published - 2000
Externally published: Yes


Keywords:
  • Automation
  • Function
  • Mass spectrometry
  • Protein delivery
  • Proteoinformatics
  • Proteomics
  • Review
  • Two-dimensional polyacrylamide gel electrophoresis


