C. E. Shannon introduced Shannon entropy, a logarithmic measure of information in communication theory. Following Shannon, various researchers developed further logarithmic and exponential entropy measures to quantify information uncertainty. In this study, a novel probabilistic entropy measure based on a quadratic function is proposed that efficiently quantifies complexity and uncertainty in complex systems. This novel probabilistic entropy measure has significant implications for how we understand complicated systems and how we make decisions in many fields. Several established entropy axioms are used to verify the validity of the new probabilistic entropy measure. The findings show that the quadratic entropy measure outperforms existing measures in capturing subtle variations in system uncertainty and behavior. Finally, we discuss several properties of this measure.

Keywords: entropy, fuzzy set, uncertainty measure, quadratic function, information measure
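For background only (the paper's own definition is given in the body), Shannon's logarithmic entropy and a classical quadratic-type entropy such as the Gini-Simpson index take the following forms for a probability distribution P = (p_1, ..., p_n); the quadratic form below is shown purely as an illustrative contrast and is not necessarily the measure proposed here:

% Shannon's (logarithmic) entropy
\[
  H_{\mathrm{Shannon}}(P) = -\sum_{i=1}^{n} p_i \log p_i
\]
% A classical quadratic-type entropy (Gini--Simpson index), given only for
% contrast; the measure proposed in this paper may differ.
\[
  H_{\mathrm{quad}}(P) = \sum_{i=1}^{n} p_i \,(1 - p_i)
\]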