Least square support vector machine for the simultaneous learning of a function and its derivative

Rui Sheng Zhang, Guozhen Liu

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding contribution › peer-review

Abstract

In this paper, the problem of simultaneously approximating a function and its derivative is formulated. First, the problem is solved for a one-dimensional input space using least squares support vector machines, with additional constraints introduced for the approximation of the derivative. To solve the resulting regression estimation problem, we derive an algorithm that is fast and more accurate for moderate-size problems. The results show that using derivative information significantly improves the reconstruction of the function.
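The paper itself is not reproduced in this record. As a rough illustration of the idea the abstract describes — least squares SVM regression augmented with derivative constraints in a one-dimensional input space — the following sketch fits a kernel expansion to both function values and derivative values by solving one regularized linear system. The Gaussian kernel choice, function names, and parameter values are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def rbf_blocks(x, z, s):
    """Gaussian kernel K(x, z) = exp(-(x - z)^2 / (2 s^2)) on 1-D inputs,
    plus the partial-derivative blocks needed for derivative constraints."""
    d = x[:, None] - z[None, :]
    K = np.exp(-d**2 / (2 * s**2))
    Kz = (d / s**2) * K                  # dK/dz, evaluated at z = z_i
    Kx = (-d / s**2) * K                 # dK/dx, evaluated at x = x_k
    Kxz = (1 / s**2 - d**2 / s**4) * K   # d^2 K / (dx dz)
    return K, Kz, Kx, Kxz

def fit_ls_svr_with_derivatives(x, y, dy, s=1.0, gamma=1e4):
    """Fit f(x) = sum_i a_i K(x, x_i) + sum_i b_i dK/dz(x, x_i) + c
    under ridge-regularized constraints f(x_k) ~ y_k and f'(x_k) ~ dy_k.
    (Hypothetical formulation, not the paper's exact dual system.)"""
    n = len(x)
    K, Kz, Kx, Kxz = rbf_blocks(x, x, s)
    I = np.eye(n) / gamma
    rows = [
        np.hstack([K + I, Kz, np.ones((n, 1))]),     # function-value constraints
        np.hstack([Kx, Kxz + I, np.zeros((n, 1))]),  # derivative constraints
        np.hstack([np.ones((1, n)), np.zeros((1, n + 1))]),  # bias row: sum(a) = 0
    ]
    sol = np.linalg.solve(np.vstack(rows), np.concatenate([y, dy, [0.0]]))
    return sol[:n], sol[n:2 * n], sol[-1], x, s

def predict(model, xt):
    a, b, c, x, s = model
    K, Kz, _, _ = rbf_blocks(xt, x, s)
    return K @ a + Kz @ b + c
```

For example, fitting sin(x) from 12 samples of the function together with its derivative cos(x) reconstructs the function closely between the sample points; dropping the derivative rows from the system recovers a plain LS-SVR fit, which is the comparison the abstract's claim is about.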

Original language: English
Title of host publication: Advanced research on electronic commerce, web application, and communication
Subtitle of host publication: International conference, ECWAC 2011, Guangzhou, China, April 16-17, 2011, proceedings, part 1
Editors: Gang Shen, Xiong Huang
Place of publication: Berlin; Heidelberg
Publisher: Springer
Pages: 427-433
Number of pages: 7
ISBN (Print): 9783642203664
Publication status: Published - 2011
Externally published: Yes
Event: International Conference on Advanced Research on Electronic Commerce, Web Application, and Communication - Guangzhou
Duration: 16 Apr 2011 - 17 Apr 2011

Publication series

Name: Communications in Computer and Information Science
Publisher: Springer-Verlag Berlin
Volume: 143
ISSN (Print): 1865-0929

Conference

Conference: International Conference on Advanced Research on Electronic Commerce, Web Application, and Communication
City: Guangzhou
Period: 16/04/11 - 17/04/11

Keywords

  • SVM
  • SVR
  • LS-SVR
  • regression
