Algorithms for Scalable On-line Machine Learning on Regression Tasks

Please use this identifier to cite or link to this item:
https://osnadocs.ub.uni-osnabrueck.de/handle/urn:nbn:de:gbv:700-201904251500
Title: Algorithms for Scalable On-line Machine Learning on Regression Tasks
Authors: Schoenke, Jan H.
Thesis advisor: Prof. Dr. Joachim Hertzberg
Thesis referee: Prof. Dr. Eyke Hüllermeier
Abstract: In the realm of ever-increasing data volume and traffic, processing data as a stream is key to building flexible and scalable data processing engines. On-line machine learning provides powerful algorithms for extracting predictive models from such data streams, even if the modeled relation is time-variant in nature. The modeling of real-valued data in on-line regression tasks is especially important, as it connects to modeling and system identification tasks in engineering domains and bridges to other fields of machine learning such as classification and reinforcement learning. This thesis therefore considers the problem of on-line regression on time-variant data streams and introduces a new multi-resolution perspective for tackling it. The proposed incremental learning system, called AS-MRA, comprises a new interpolation scheme for symmetric simplicial input segmentations, a layered approximation structure of sequential local refinement layers, and a learning architecture for efficiently training the layer structure. A key concept for making these components work together in harmony is a differential parameter encoding between subsequent refinement layers, which allows the target function to be decomposed into independent additive components, each represented as an individual refinement layer. The AS-MRA approach as a whole is designed to form a smooth approximation whose computational demands scale linearly with the input dimension, while its overall expressiveness, and therefore its potential storage demands, scale exponentially with the input dimension. The AS-MRA has no mandatory design parameters, but lets the user state tolerance parameters for the expected prediction performance, which automatically and adaptively shape the resulting layer structure during the learning process. Other optional design parameters allow the resource consumption to be restricted with respect to computational and memory demands. The effect of these parameters and the learning behavior of the AS-MRA as such are investigated with respect to various learning issues and compared to related on-line learning approaches. The merits and contributions of the AS-MRA are shown experimentally and linked to general considerations about the relation between key concepts of the AS-MRA and fundamental results in machine learning.
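
The abstract's central idea of a differential parameter encoding between refinement layers can be illustrated, very loosely, by a generic residual-refinement learner: each finer layer is trained on-line on whatever error the coarser layers leave behind, so the layers sum to the overall prediction. The sketch below is a minimal, hypothetical Python illustration of that general principle only; the names (RefinementLayer, LayeredRegressor, partial_fit), the 1-D piecewise-constant layers, the fixed resolutions, and the learning-rate update are all assumptions made for the example and are not taken from the thesis, which uses simplicial segmentations and its own adaptive layer-construction and training scheme.

# Hypothetical sketch of layered on-line residual refinement on a 1-D input in [0, 1].
# Illustrative only; does not reproduce the AS-MRA.
import numpy as np

class RefinementLayer:
    """Piecewise-constant approximator on a uniform grid over [0, 1]."""
    def __init__(self, cells, lr=0.2):
        self.cells = cells
        self.lr = lr
        self.values = np.zeros(cells)  # one parameter per grid cell

    def _cell(self, x):
        # Index of the grid cell containing x (clamped to the last cell at x = 1).
        return min(int(x * self.cells), self.cells - 1)

    def predict(self, x):
        return self.values[self._cell(x)]

    def update(self, x, residual):
        # Move the local parameter towards the residual left by the coarser layers.
        c = self._cell(x)
        self.values[c] += self.lr * (residual - self.values[c])

class LayeredRegressor:
    """Stack of layers of increasing resolution; predictions are summed."""
    def __init__(self, resolutions=(2, 4, 8, 16, 32)):
        self.layers = [RefinementLayer(r) for r in resolutions]

    def predict(self, x):
        return sum(layer.predict(x) for layer in self.layers)

    def partial_fit(self, x, y):
        # On-line update: each layer sees only the error remaining after the
        # coarser layers, i.e. it learns a differential correction.
        residual = y
        for layer in self.layers:
            layer.update(x, residual)
            residual -= layer.predict(x)

if __name__ == "__main__":
    # Usage sketch: stream noisy samples of a fixed target and learn on-line.
    rng = np.random.default_rng(0)
    model = LayeredRegressor()
    for _ in range(20000):
        x = rng.random()
        y = np.sin(6.0 * x) + 0.05 * rng.normal()
        model.partial_fit(x, y)
    print(model.predict(0.3), np.sin(6.0 * 0.3))  # prediction vs. noiseless target

In the actual AS-MRA, according to the abstract, the refinement layers are built over symmetric simplicial input segmentations and are shaped adaptively during learning by user-stated tolerance parameters, rather than being fixed up front as in this sketch.
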
URL: https://osnadocs.ub.uni-osnabrueck.de/handle/urn:nbn:de:gbv:700-201904251500
Subject Keywords: On-line Regression; Smooth Interpolation; Simplicial Segmentation; On-line Machine Learning
Issue Date: 25-Apr-2019
License name: Attribution-NonCommercial-NoDerivs 3.0 Germany
License url: http://creativecommons.org/licenses/by-nc-nd/3.0/de/
Type of publication: Dissertation or habilitation [doctoralThesis]
Appears in Collections: FB06 - E-Dissertationen

Files in This Item:
File: thesis_schoenke.pdf
Description: Presentation format
Size: 1.85 MB
Format: Adobe PDF


This item is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Germany License.