With the rise of multidimensional object learning, there has been increasing interest in tensor representations of data, which allow for the simultaneous modeling of information from multiple sources. Previous studies have largely treated the observed tensors as independent objects and have not explicitly accounted for potential temporal dependency among them. This assumption substantially limits the applicability of tensor representations, since most multidimensional time-dependent processes, from crypto-assets to human brain signals, exhibit various types of serial dependency. Neglecting this dependence structure typically results in inflated false-positive rates, especially when the observed samples are of moderate size, and also leads to underestimation of model uncertainty, which profoundly affects all aspects of risk analytics. To address this fundamental restriction, we propose a novel framework, named WeDTLasso, for estimating the concentration matrix of a sequence of tensor-valued data, which allows us to explicitly and systematically account for the dependency between multiple information sources over time. We derive theoretical guarantees establishing a near-oracle error bound for WeDTLasso and assess its performance on simulated data exhibiting various types of serial dependency. Furthermore, we illustrate the utility of WeDTLasso in application to portfolio construction for crypto-assets. We find that, based on the Matthews Correlation Coefficient, WeDTLasso outperforms other state-of-the-art approaches with relative gains of up to 20%. Moreover, WeDTLasso yields up to a 50% increase in the Sharpe Ratio for portfolio selection, compared to state-of-the-art algorithms for portfolio analytics.