Johannes Hirth
Room 0443
Universität Kassel
Department of Electrical Engineering and Computer Science
Knowledge & Data Engineering Group
Wilhelmshöher Allee 73
34121 Kassel
Phone: +49 561 804-6350
Fax: +49 561 804-6259
Email: hirth@cs.uni-kassel.de (PGP-Key: 0x2D9F0D2E01928BC8)
Publications
1.
Hanika, T., Hirth, J.: Knowledge cores in large formal contexts. Annals of Mathematics and Artificial Intelligence (2022). https://doi.org/10.1007/s10472-022-09790-6
Knowledge computation tasks, such as computing a base of valid implications, are often infeasible for large data sets. This is in particular true when deriving canonical bases in formal concept analysis (FCA). Therefore, it is necessary to find techniques that on the one hand reduce the data set size, but on the other hand preserve enough structure to extract useful knowledge. Many successful methods are based on random processes to reduce the size of the investigated data set. This, however, makes them hardly interpretable with respect to the discovered knowledge. Other approaches restrict themselves to highly supported subsets and omit rare and (maybe) interesting patterns. An essentially different approach is used in network science, called k-cores. These cores are able to reflect rare patterns, as long as they are well connected within the data set. In this work, we study k-cores in the realm of FCA by exploiting the natural correspondence of bi-partite graphs and formal contexts. This structurally motivated approach leads to a comprehensible extraction of knowledge cores from large formal contexts.
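Since the abstract builds on the correspondence between formal contexts and bipartite graphs, here is a minimal illustrative sketch (plain Python with invented names and toy data, not code from the paper): classical k-core peeling on the incidence relation, i.e., iteratively removing objects and attributes with fewer than k incidences. The paper's precise core notion may differ in detail.

```python
# Minimal sketch: k-core peeling on the bipartite incidence graph of a
# formal context (G, M, I). Function name and toy data are illustrative.

def k_core(objects, attributes, incidence, k):
    """Largest subcontext in which every object has at least k attributes
    and every attribute applies to at least k objects."""
    G, M, I = set(objects), set(attributes), set(incidence)
    changed = True
    while changed:
        changed = False
        weak_objects = {g for g in G if sum(1 for (h, _) in I if h == g) < k}
        weak_attributes = {m for m in M if sum(1 for (_, n) in I if n == m) < k}
        if weak_objects or weak_attributes:
            G -= weak_objects
            M -= weak_attributes
            I = {(g, m) for (g, m) in I if g in G and m in M}
            changed = True
    return G, M, I

# Toy context: the sparsely connected pair (g3, c) is peeled away for k = 2.
objects = {"g1", "g2", "g3"}
attributes = {"a", "b", "c"}
incidence = {("g1", "a"), ("g1", "b"), ("g2", "a"), ("g2", "b"), ("g3", "c")}
print(k_core(objects, attributes, incidence, k=2))
```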
2.
Hanika, T., Hirth, J.: Quantifying the Conceptual Error in Dimensionality Reduction. In: Braun, T., Gehrke, M., Hanika, T., and Hernandez, N. (eds.) Graph-Based Representation and Reasoning - 26th International Conference on Conceptual Structures, ICCS 2021, Virtual Event, September 20-22, 2021, Proceedings. pp. 105–118. Springer (2021). https://doi.org/10.1007/978-3-030-86982-3_8
3.
Hanika, T., Hirth, J.: Exploring Scale-Measures of Data Sets. In: Braud, A., Buzmakov, A., Hanika, T., and Ber, F.L. (eds.) Formal Concept Analysis - 16th International Conference, ICFCA 2021, Strasbourg, France, June 29 - July 2, 2021, Proceedings. pp. 261–269. Springer (2021). https://doi.org/10.1007/978-3-030-77867-5_17
4.
Hanika, T., Hirth, J.: On the Lattice of Conceptual Measurements. arXiv:2012.05267 (2020). https://arxiv.org/abs/2012.05267
We present a novel approach for data set scaling based on scale-measures from formal concept analysis, i.e., continuous maps between closure systems, and derive a canonical representation. Moreover, we prove said scale-measures are lattice ordered with respect to the closure systems. This enables exploring the set of scale-measures through the use of meet and join operations. Furthermore, we show that the lattice of scale-measures is isomorphic to the lattice of sub-closure systems that arises from the original data. Finally, we provide another representation of scale-measures using propositional logic in terms of data set features. Our theoretical findings are discussed by means of examples.
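To make the continuity condition behind scale-measures concrete, here is a minimal sketch (illustrative Python; the function name, closure systems, and map are toy data of mine, not code from the paper): a map between the object sets of two contexts is a scale-measure precisely when the preimage of every closed set of the scale is closed in the original data.

```python
# Minimal sketch of the continuity condition for scale-measures: the
# preimage of every closed set of the scale must be closed in the data.
# All names and toy data below are illustrative.

def is_scale_measure(sigma, closed_in_data, closed_in_scale):
    """sigma maps data objects to scale objects; both closure systems are
    given extensionally as collections of frozensets."""
    for A in closed_in_scale:
        preimage = frozenset(g for g, s in sigma.items() if s in A)
        if preimage not in closed_in_data:
            return False
    return True

# Toy closure systems on three data objects and two scale objects.
closed_data = {frozenset(), frozenset({"g1"}), frozenset({"g3"}),
               frozenset({"g1", "g2"}), frozenset({"g1", "g2", "g3"})}
closed_scale = {frozenset(), frozenset({"s1"}), frozenset({"s1", "s2"})}
sigma = {"g1": "s1", "g2": "s1", "g3": "s2"}
print(is_scale_measure(sigma, closed_data, closed_scale))  # True
```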
5.
Hanika, T., Hirth, J.: Conexp-Clj - A Research Tool for FCA. In: Cristea, D., Ber, F.L., Missaoui, R., Kwuida, L., and Sertkaya, B. (eds.) ICFCA (Supplements). pp. 70–75. CEUR Workshop Proceedings, vol. 2378. CEUR-WS.org (2019).
Talks
- September 2021: 'Quantifying the Conceptual Error in Dimensionality Reduction', Applications of Formal Sciences: Explainable AI, Dagstuhl
- March 2021: 'Discovery of Conceptual Measurements', Application of Formal Sciences: Explainable Artificial Intelligence, Dagstuhl
- September 2020: 'Navigating Conceptual Measurements', Dagstuhl Meeting on the Application of Formal Sciences: Knowledge Engineering
Reviewing
- Subreviewer: ICCS, 12–15 September 2022, Münster, Germany
- Subreviewer: ECML PKDD, 19–23 September 2022, Grenoble, France
- Subreviewer: FCA4AI-2021, 21 August 2021, Montréal, Canada
- Subreviewer: ECML PKDD, 13–17 September 2021
- Subreviewer: 17th Russian Conference on Artificial Intelligence, 21–25 October 2019, Ulyanovsk, Russia
- Subreviewer: 11th ACM conference on Web Science, June 30–July 3, 2019, Boston, MA, USA
Teaching
Winter Term 2022:
Projects
Alongside my theoretical research, I mainly work on two projects:
Conexp-Clj — Scale-Exploration Standalone
Accompanying our research on conceptual measurements, we implemented the scale-measure exploration algorithm in conexp-clj and provide a ready-to-use standalone version of it; a small illustrative sketch of the underlying idea follows the paper list below. Associated papers:
- Hanika, T., Hirth, J.: Quantifying the Conceptual Error in Dimensionality Reduction. In: Braun, T., Gehrke, M., Hanika, T., and Hernandez, N. (eds.) Graph-Based Representation and Reasoning – 26th International Conference on Conceptual Structures, ICCS 2021, Virtual Event, September 20-22, 2021, Proceedings. pp. 105–118. Springer (2021).
- Hanika, T., Hirth, J.: Exploring Scale-Measures of Data Sets. In: Braud, A., Buzmakov, A., Hanika, T., and Ber, F.L. (eds.) Formal Concept Analysis – 16th International Conference, ICFCA 2021, Strasbourg, France, June 29 – July 2, 2021, Proceedings. pp. 261–269. Springer (2021).
- Hanika, T., Hirth, J.: On the Lattice of Conceptual Measurements (submitted).
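As a taste of what the standalone tool automates, the following is a small self-contained sketch (plain Python, not the conexp-clj API; all names and toy data are mine): projecting a context onto a subset of its attributes yields a coarser view whose extents form a sub-closure system of the original extents, which is the kind of conceptual measurement the exploration navigates.

```python
# Illustrative sketch: attribute selection induces a coarser closure system.
# Brute-force extent enumeration is fine for toy-sized contexts.

from itertools import chain, combinations

def all_extents(objects, attributes, incidence):
    """All extents of the context restricted to `attributes`, obtained by
    closing every attribute subset."""
    def extent(B):
        return frozenset(g for g in objects
                         if all((g, m) in incidence for m in B))
    subsets = chain.from_iterable(
        combinations(sorted(attributes), r)
        for r in range(len(attributes) + 1))
    return {extent(set(B)) for B in subsets}

objects = {"g1", "g2", "g3"}
attributes = {"a", "b", "c"}
incidence = {("g1", "a"), ("g2", "a"), ("g2", "b"), ("g3", "b"), ("g3", "c")}

full = all_extents(objects, attributes, incidence)
coarse = all_extents(objects, {"a"}, incidence)  # project onto {a}
print(coarse <= full)  # coarser extents form a sub-closure system: True
```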
Conexp-Clj — A Research Tool for FCA
The research unit Knowledge & Data Engineering continues the development of the research tool conexp-clj, originally created by Dr. Daniel Borchmann. The continuous enhancement of the software package is supervised by Dr. Tom Hanika. Having such a tool at hand, the research group is able to test and analyze its theoretical research efforts in formal concept analysis and related fields. The most recent pre-compiled release candidate can be downloaded here.
A presentation of the tool can be found in the paper Conexp-Clj – A Research Tool for FCA.

BibSonomy
BibSonomy is a scholarly social bookmarking system in which researchers manage their collections of publications and web pages. BibSonomy is an open-source project, continuously developed by researchers in Kassel, Würzburg, and Hanover. As a test bed for recommendation and ranking algorithms, and through its publicly available datasets containing traces of user behavior on the Web, BibSonomy has been the subject of various scientific studies.
The project Fair Digital Services: "Co-Valuation in the Design of Data-Economic Business Models" (FAIRDIENSTE) pursues an interdisciplinary approach that combines sociological and (business) informatics perspectives. It investigates fair business models that aim at cooperation and the mediation of values.
One goal of this work is the further development of informatics methods for qualitative data analysis. These methods are intended to make the landscape of conflicts arising at the customer interfaces of digital services transparent and to enable consumers to critically assess different value perspectives.