This book constitutes the refereed proceedings of the Third European Conference on Computational Learning Theory, EuroCOLT'97, held in Jerusalem, Israel, in March 1997. The book presents 25 revised full papers carefully selected from a total of 36 high-quality submissions. The volume spans the whole spectrum of computational learning theory, with a certain emphasis on mathematical models of machine learning. Among the topics addressed are machine learning, neural nets, statistics, inductive inference, computational complexity, information theory, and theoretical physics.
Publisher's item no.: 10550390, 978-3-540-62685-5
1997.
Number of pages: 348
Publication date: 3 March 1997
Language: English
Dimensions: 235mm x 155mm x 19mm
Weight: 450g
ISBN-13: 9783540626855
ISBN-10: 3540626859
Item no.: 09218696
Table of Contents
Sample compression, learnability, and the Vapnik-Chervonenkis dimension.
Learning boxes in high dimension.
Learning monotone term decision lists.
Learning matrix functions over rings.
Learning from incomplete boundary queries using split graphs and hypergraphs.
Generalization of the PAC-model for learning with partial information.
Monotonic and dual-monotonic probabilistic language learning of indexed families with high probability.
Closedness properties in team learning of recursive functions.
Structural measures for games and process control in the branch learning model.
Learning under persistent drift.
Randomized hypotheses and minimum disagreement hypotheses for learning with noise.
Learning when to trust which experts.
On learning branching programs and small depth circuits.
Learning nearly monotone k-term DNF.
Optimal attribute-efficient learning of disjunction, parity, and threshold functions.
Learning pattern languages using queries.
On fast and simple algorithms for finding maximal subarrays and applications in learning theory.
A minimax lower bound for empirical quantizer design.
Vapnik-Chervonenkis dimension of recurrent neural networks.
Linear algebraic proofs of VC-dimension based inequalities.
A result relating convex n-widths to covering numbers with some applications to neural networks.
Confidence estimates of classification accuracy on new examples.
Learning formulae from elementary facts.
Control structures in hypothesis spaces: The influence on learning.
Ordinal mind change complexity of language identification.
Robust learning with infinite additional information.