Accession Number:

ADA488505

Title:

Nonextensive Entropic Kernels

Descriptive Note:

Technical report.

Corporate Author:

CARNEGIE-MELLON UNIV PITTSBURGH PA SCHOOL OF COMPUTER SCIENCE

Report Date:

2008-08-01

Pagination or Media Count:

50

Abstract:

Positive definite kernels on probability measures have recently been applied to classification problems involving text, images, and other types of structured data. Some of these kernels are related to classic information-theoretic quantities, such as Shannon's mutual information and the Jensen-Shannon (JS) divergence. Meanwhile, there have been recent advances in nonextensive generalizations of Shannon's information theory. This paper bridges these two trends by introducing nonextensive information-theoretic kernels on probability measures, based on new JS-type divergences. These new divergences result from extending the two building blocks of the classical JS divergence: convexity and Shannon's entropy. The classical notion of convexity is extended to the wider concept of q-convexity, for which we prove a Jensen q-inequality. Based on this inequality, we introduce the Jensen-Tsallis (JT) q-difference, a nonextensive generalization of the JS divergence, and define a k-th order JT q-difference between stochastic processes. We then define a new family of nonextensive mutual information kernels, which allow weights to be assigned to their arguments, and which include the Boolean, JS, and linear kernels as particular cases. Nonextensive string kernels are also defined, subsuming the p-spectrum kernel. We illustrate the performance of these kernels on text categorization tasks, in which documents are modeled both as bags of words and as sequences of characters.
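
To make the central quantities concrete, the following is a minimal illustrative sketch (not the authors' code). It assumes the standard Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1) and a two-distribution JT q-difference with uniform weights raised to the power q, in line with the Jensen q-inequality described above; at q = 1 it reduces to the classical Jensen-Shannon divergence.

    # Illustrative sketch only; the function names and the uniform-weight
    # choice are assumptions, not the report's implementation.
    import numpy as np

    def tsallis_entropy(p, q):
        """Tsallis entropy S_q(p); recovers Shannon entropy (nats) as q -> 1."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        if np.isclose(q, 1.0):
            return -np.sum(p * np.log(p))  # Shannon limit
        return (1.0 - np.sum(p ** q)) / (q - 1.0)

    def jt_q_difference(p1, p2, q):
        """JT q-difference with uniform weights (1/2, 1/2), each raised
        to the power q as in the Jensen q-inequality."""
        p1, p2 = np.asarray(p1, dtype=float), np.asarray(p2, dtype=float)
        mixture = 0.5 * (p1 + p2)
        return tsallis_entropy(mixture, q) - 0.5 ** q * (
            tsallis_entropy(p1, q) + tsallis_entropy(p2, q))

    p1 = [0.7, 0.2, 0.1]
    p2 = [0.1, 0.3, 0.6]
    print(jt_q_difference(p1, p2, q=1.0))  # classical JS divergence
    print(jt_q_difference(p1, p2, q=2.0))  # a nonextensive case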

Subject Categories:

  • Cybernetics

Distribution Statement:

APPROVED FOR PUBLIC RELEASE