Searchable List of Research Output

  • Vitányi, P.M.B., Balbach, F.J., Cilibrasi, R.L., Li, M. (2008) Normalized information distance.
    Institute for Logic, Language and Computation.
  • Vitányi, P.M.B., Balbach, F.J., Cilibrasi, R.L., Li, M. (2009) Normalized information distance.
    In Emmert-Streib, F., Dehmer, M. (Eds.), Information theory and statistical learning (pp 45-82). Springer.
  • Vitányi, P.M.B., Buhrman, H., Tromp, J.A.H. (2001) Time and space bounds for reversible simulation.
    In Proc. ICALP 2001, Lecture Notes in Computer Science (pp 1017-1027). Springer-Verlag.
    Conference contribution | UvA-DARE
  • Vitányi, P.M.B., Chater, N. (2013) Algorithmic Identification of Probabilities.
    ArXiv.
  • Vitányi, P.M.B., Chater, N. (2017) Identification of probabilities.
    Journal of Mathematical Psychology, Vol. 76 (pp 13-24)
  • Vitányi, P.M.B., Cilibrasi, R.L. (2007) The Google similarity distance.
    IEEE Transactions on Knowledge & Data Engineering, Vol. 19 (pp 370-383)
  • Vitányi, P.M.B., Cilibrasi, R.L. (2010) Normalized web distance and word similarity.
    In Indurkhya, N., Damerau, F.J. (Eds.), Handbook of natural language processing (pp 293-314) (Chapman & Hall/CRC machine learning & pattern recognition series). CRC Press.
  • Vitányi, P.M.B., Gács, P., Tromp, J.A.H. (2001) Algorithmic statistics.
    IEEE Transactions on Information Theory, Vol. 47 (pp 2443-2463)
  • Vitányi, P.M.B., Li, M. (1995) Algorithmic arguments in physics of computation.
    In Proc. 4th Workshop on Algorithms and Data Structures, Kingston, Ontario (pp 315-333) (Lecture Notes in Computer Science). Springer-Verlag.
    Chapter | UvA-DARE
  • Vitányi, P.M.B., Li, M. (1996) Reversible simulation of irreversible computation.
    In Proc. 11th IEEE Conference on Computational Complexity (pp 301-306)
    Chapter | UvA-DARE
  • Vitányi, P.M.B., Li, M. (1996) Average-Case Analysis Using Kolmogorov Complexity.
    In Ron Book Memorial Volume. Springer-Verlag.
    Chapter | UvA-DARE
  • Vitányi, P.M.B., Li, M. (1997) On prediction by data compression.
    In Proc. 9th European Conference on Machine Learning (pp 14-30) (Lecture Notes in Artificial Intelligence). Springer-Verlag.
    Chapter | UvA-DARE
  • Vitányi, P.M.B., Li, M. (2000) Minimum Description Length Induction, Bayesianism, and Kolmogorov Complexity.
    IEEE Transactions on Information Theory, Vol. 46 (pp 446-464)
  • Vitányi, P.M.B., Li, M. (2002) Simplicity, information, Kolmogorov complexity and prediction.
    In Zellner, A., Keuzenkamp, H.A., McAleer, M. (Eds.), Simplicity, Inference and Modelling (pp 135-155). Cambridge University Press.
    Chapter | UvA-DARE
  • Vitányi, P.M.B., Vereshchagin, N. (2003) Kolmogorov's structure functions, nonprobabilistic statistics and the foundation of model selection.
    In Proceedings IEEE International symposium on information theory (pp 286-286)
    Conference contribution | UvA-DARE
  • Vitányi, P.M.B. (1994) Randomness.
    In Schrijver, A., Apt, K.R., Temme, N. (Eds.), From Universal morphisms to megabytes: a Baayen space Odyssey. CWI.
    Chapter | UvA-DARE
  • Vitányi, P.M.B. (1994) Multiprocessor architectures and physical law.
    In Matzke (Ed.), Proc. 2nd IEEE Workshop on Physics and Computation, PhysComp'94 (pp 24-29). IEEE.
    Chapter | UvA-DARE
  • Vitányi, P.M.B. (1995) Physics and the New Computation.
    In Proc. 20th Int. Symp. Math. Foundations of Computer Science, MFCS'95 (pp 106-128) (Lecture Notes in Computer Science). Springer-Verlag.
    Chapter | UvA-DARE
  • Vitányi, P.M.B. (1995) Computational learning theory: second European conference, EuroCOLT '95, Barcelona, Spain, March 13-15, 1995: proceedings.
    Lecture notes in computer science. Springer.
  • Vitányi, P.M.B. (1996) Genetic fitness optimization using rapidly mixing Markov chains.
    In Proc. 7th Int'l Workshop on Algorithmic Learning Theory (pp 67-82) (Lecture Notes in Artificial Intelligence). Springer-Verlag.
    Chapter | UvA-DARE

The data in this list is taken from the Pure database. If you find output missing from the list, please follow the link above to find out how to submit to Pure. If there are mistakes in Pure, please contact