In recent years, considerable effort has been devoted to applying the quantities of information theory to the electronic structure and properties of various systems. In this context, one can use one or several information-theoretic quantities together to describe the total energy, its components, and other electronic properties. Such an approach, known as information functional theory, constitutes the cornerstone of the present investigation. More specifically, in this work several information-theoretic quantities, namely the Fisher information, Shannon entropy, Onicescu information energy, and Ghosh-Berkowitz-Parr entropy, are considered in both the electron-density and shape-function representations for the reliable prediction of atomic and molecular correlation energies, as well as several electronic properties such as atomization energies, electron affinities, and ionization potentials. It is shown that, although the individual quantities differ somewhat in predictive power, they can serve as useful descriptors for estimating electron correlation energies across a wide variety of systems, including neutral atoms, cations, isoelectronic series, and molecules; the same holds for the electronic properties under study. Because the information-theoretic quantities obey different scaling properties and carry different physicochemical meanings with respect to the electron density distribution, we find that, instead of fitting all data with any one quantity individually, taking all of them together provides a better description of the correlation effects and electronic properties of the systems.
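To make the descriptors concrete, the sketch below evaluates three of the quantities named above, using their standard definitions (Shannon entropy S = -∫ρ ln ρ dr, Fisher information I = ∫|∇ρ|²/ρ dr, Onicescu information energy E = ∫ρ² dr), for the analytically known hydrogen 1s density ρ(r) = e⁻²ʳ/π in atomic units. The radial grid, quadrature, and test density are illustrative choices for this sketch, not the computational methodology of the present work; the Ghosh-Berkowitz-Parr entropy is omitted here because it additionally requires the kinetic-energy density.

```python
import numpy as np

# Radial grid (bohr) and the hydrogen 1s density in atomic units.
r = np.linspace(1e-6, 30.0, 200001)
rho = np.exp(-2.0 * r) / np.pi
drho_dr = -2.0 * rho  # analytic radial derivative of rho

def radial_integral(f, r):
    """Integrate f over all space assuming spherical symmetry
    (trapezoidal rule on the radial grid)."""
    integrand = 4.0 * np.pi * r**2 * f
    return np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(r))

# Shannon entropy: S = -∫ rho ln(rho) dr; analytic value 3 + ln(pi).
shannon = radial_integral(-rho * np.log(rho), r)

# Fisher information: I = ∫ |∇rho|² / rho dr; analytic value 4.
fisher = radial_integral(drho_dr**2 / rho, r)

# Onicescu information energy: E = ∫ rho² dr; analytic value 1/(8*pi).
onicescu = radial_integral(rho**2, r)

print(shannon, fisher, onicescu)
```

For this simple closed-shell density, the quadrature reproduces the analytic values (3 + ln π ≈ 4.145, 4, and 1/(8π) ≈ 0.0398) to high accuracy, which makes the hydrogen 1s case a convenient sanity check before applying the same definitions to numerically tabulated atomic or molecular densities.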