We present a framework for quantifying the dynamics of information in coupled physiological systems based on the notion of conditional entropy (CondEn). First, we revisit some basic concepts of information dynamics, providing definitions of self entropy (SE), cross entropy (CE) and transfer entropy (TE) as measures of information storage and transfer in bivariate systems. We also discuss the generalization to multivariate systems, showing the importance of SE, CE and TE as relevant terms in the decomposition of the system's predictive information. Then, we show how all these measures can be expressed in terms of CondEn, and accordingly devise a framework for their data-efficient estimation. The framework builds on a CondEn estimator that follows a sequential conditioning procedure, whereby the conditioning vectors are formed progressively according to a criterion for CondEn minimization, and that compensates for the bias arising as the dimension of the conditioning vectors increases. The framework is illustrated on numerical examples showing its capability to deal with the curse of dimensionality in the multivariate computation of CondEn, and to reliably estimate SE, CE and TE under the challenging conditions of biomedical time series analysis, namely noise and small sample size. Finally, we illustrate the practical application of the presented framework to cardiovascular and neural time series, reporting representative examples in which SE, CE and TE are estimated to quantify the information dynamics of the underlying physiological systems.
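To make the CondEn-based decomposition concrete, the sketch below estimates transfer entropy as a difference of two conditional entropies, TE(X→Y) = H(Y_n | Y_past) − H(Y_n | Y_past, X_past). This is a minimal plug-in (histogram) estimator with uniform coarse-graining; it is only an illustration of the CondEn formulation, not the sequential-conditioning, bias-compensated estimator described in the paper, and the function names, embedding length `k`, and bin count `bins` are choices made here for the example.

```python
import numpy as np

def _entropy(samples):
    # Plug-in (histogram) entropy, in nats, of discrete-valued sample rows.
    _, counts = np.unique(samples, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def cond_entropy(target, cond):
    # H(target | cond) = H(target, cond) - H(cond)
    return _entropy(np.column_stack([target, cond])) - _entropy(cond)

def transfer_entropy(x, y, k=1, bins=4):
    # TE(X->Y) with embedding length k, after coarse-graining each series
    # into equal-width bins (illustrative plug-in estimator only).
    xd = np.digitize(x, np.linspace(x.min(), x.max(), bins + 1)[1:-1])
    yd = np.digitize(y, np.linspace(y.min(), y.max(), bins + 1)[1:-1])
    n = len(y)
    y_now = yd[k:n].reshape(-1, 1)
    y_past = np.column_stack([yd[k - j - 1:n - j - 1] for j in range(k)])
    x_past = np.column_stack([xd[k - j - 1:n - j - 1] for j in range(k)])
    # TE(X->Y) = H(Y_n | Y_past) - H(Y_n | Y_past, X_past)
    return cond_entropy(y_now, y_past) - cond_entropy(
        y_now, np.column_stack([y_past, x_past]))
```

Self entropy can be obtained analogously as H(Y_n) − H(Y_n | Y_past). Note that the plug-in estimate grows increasingly biased as `k` or `bins` increase, which is precisely the curse-of-dimensionality problem the paper's sequential conditioning and bias compensation are designed to address.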