Global Versus Local Symmetries, 2023
...

Symmetry has been a cornerstone of human thought and aesthetics in many civilizations since ancient times. While the ancient interpretation of symmetry encompassed the idea of equal arrangement and proportion, the modern understanding is limited to the set of transformations that leave an object invariant. We investigate the concept of partial (local) symmetry, which may be viewed as a return to the original meaning of the term: it stresses the importance of proportionality while still capturing the modern sense of symmetry. We also examine its significance in various disciplines, such as neuroaesthetics and mathematics, and argue that the concept of local (partial) symmetry, as opposed to global (total) symmetry, is more natural, more general, and better describes natural phenomena and symmetries in abstract structures.

@inproceedings{pastorekGlobalLocalSymmetries2023,
  title = {Global Versus Local Symmetries},
  booktitle = {Proceedings of the MEi: CogSci Conference},
  author = {Pastorek, Jan},
  year = {2023},
  volume = {17},
  publisher = {Comenius University in Bratislava},
  isbn = {978-80-223-5633-6},
  langid = {english},
  url = {https://journals.phl.univie.ac.at/meicogsci/article/view/571}
}

Semantic Primitives in Word Embeddings, 2023
...

Semantic primitives are the core concepts that possibly all humans share. They cannot be defined by any other concepts, for the chain of definitions ends in them. Finding such a set would provide us with a common communicative “mother language”. We could use such a set to communicate ethical norms to less developed communities [1]. The list of such primes is already stable, numbering 65 in total, including words such as TRUE, GOOD, NOT, and YOU. Modern NLP models can capture the semantic similarity of words based on statistical co-occurrences. Such models create global embeddings: a vector for each word in the training data, such that words occurring in similar contexts occupy nearby positions in the vector space [2]. The vector spaces produced by these models are based purely on co-occurrence statistics, and the models do not explicitly encode the fundamental semantic properties associated with semantic primitives. Do the vectors corresponding to semantic primitives nevertheless emerge near mathematically special regions of these vector spaces? In other words, are the primes close to SVD singular vectors, PCA components, or K-Means cluster centers?
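The question above can be sketched computationally: for each prime's vector, measure cosine similarity to its nearest PCA component direction and nearest K-Means cluster center. The sketch below is a minimal illustration, not the paper's actual method; the embeddings are random stand-ins, and in practice they would come from a trained model such as word2vec or GloVe, with the prime rows selected by their word labels.

```python
# Minimal sketch (assumed setup, not the paper's pipeline): how close do
# "prime" vectors lie to PCA components and K-Means cluster centers?
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(1000, 50))  # stand-in: 1000 words, 50-dim vectors
primes = embeddings[:65]                  # stand-in for the 65 prime vectors

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Similarity of each prime to its nearest PCA component direction.
# abs() because a component's sign is arbitrary.
pca = PCA(n_components=10).fit(embeddings)
pca_sim = np.array([max(abs(cosine(p, c)) for c in pca.components_)
                    for p in primes])

# Similarity of each prime to its nearest K-Means cluster center.
km = KMeans(n_clusters=10, n_init=10, random_state=0).fit(embeddings)
km_sim = np.array([max(cosine(p, c) for c in km.cluster_centers_)
                   for p in primes])

print(f"mean |cos| to nearest PCA component: {pca_sim.mean():.3f}")
print(f"mean cos to nearest cluster center:  {km_sim.mean():.3f}")
```

With random embeddings both means stay low; the hypothesis in the abstract is that real prime vectors would score noticeably higher than a random baseline.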

@inproceedings{pastorekSemanticPrimitivesWord2023,
  title = {Semantic Primitives in Word Embeddings},
  booktitle = {Proceedings of the MEi: CogSci Conference},
  author = {Pastorek, Jan},
  year = {2023},
  number = {1},
  publisher = {Comenius University in Bratislava},
  address = {Budapest},
  isbn = {978-80-223-5633-6},
  langid = {english},
  url = {https://journals.phl.univie.ac.at/meicogsci/article/view/567},
}

Unraveling the Hidden Influence of Ernst Mach on the Foundations of Cognitive Science - Interdisciplinary Approach, 2023
...

Existing narratives often overlook the significant impact of Ernst Mach and the Vienna Circle on the foundations of cognitive science. In this study, we delve into the underexplored influence of Mach’s theories on the emergence of cognitive science, employing an interdisciplinary approach that blends rigorous argumentation with computational methods from network science and natural language processing. Our findings reveal multiple, previously unrecognized pathways of influence from Mach to pivotal figures in cognitive science, thereby showcasing the efficacy of our combined approach in illuminating the intricate web of intellectual connections. This method offers valuable insights into tracing the potential influences of key thinkers, addressing a longstanding challenge in the history of science arising from the ever-growing corpus of academic literature. To our knowledge, this is one of the first papers to use both citation networks and natural language processing to investigate the history of cognitive science.

@inproceedings{pastorekUnravelingHiddenInfluence2023,
  title = {Unraveling the Hidden Influence of Ernst Mach on the Foundations of Cognitive Science - Interdisciplinary Approach},
  booktitle = {Kognicia a Umely Zivot 2023},
  author = {Pastorek, Jan and Sarto-Jackson, Isabella},
  year = {2023},
  publisher = {Comenius University in Bratislava},
  isbn = {978-80-223-5610-7},
  langid = {english},
  url = {https://cogsci.fmph.uniba.sk/kuz2023/prispevky},
}