Amazon has introduced Mitra, a Tabular Foundation Model (TFM) that sets new benchmarks for generalization across diverse structured datasets. Mitra achieves state-of-the-art performance through a pretraining method built on a mixture of synthetic prior distributions, including causal models and tree-based methods, echoing the data-scaling recipe behind large language models. Integrated into AutoGluon, the model learns robust representations and, via in-context learning, consistently outperforms other TFMs as well as task-specific baselines. The result is a more general and highly effective solution for tabular prediction tasks, and it has been released as open source.
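To make the in-context learning idea concrete: unlike a conventional model that is fit per task, a TFM receives the labeled training rows and the unlabeled test rows together in a single forward call and predicts directly. The sketch below illustrates only that interface pattern; a simple nearest-neighbor rule stands in for the pretrained transformer, and none of the names here come from Mitra or AutoGluon.

```python
# Toy illustration of the TFM-style in-context prediction interface:
# labeled context rows and unlabeled query rows enter one call, and no
# per-task training loop runs.  The 1-nearest-neighbor rule below is a
# stand-in for the pretrained model, NOT Mitra's actual architecture.

def predict_in_context(X_train, y_train, X_test):
    """Predict labels for X_test conditioned on the (X_train, y_train) context."""
    preds = []
    for x in X_test:
        # Score every context row by squared distance to the query row.
        dists = [sum((a - b) ** 2 for a, b in zip(x, row)) for row in X_train]
        nearest = min(range(len(X_train)), key=dists.__getitem__)
        preds.append(y_train[nearest])
    return preds

if __name__ == "__main__":
    X_train = [[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]]
    y_train = ["low", "low", "high"]
    X_test = [[0.2, 0.1], [4.8, 5.1]]
    print(predict_in_context(X_train, y_train, X_test))  # ['low', 'high']
```

The point of the pattern is that "training" amounts to conditioning on the context rows at inference time; in a real TFM, the pretrained network's forward pass plays the role the distance rule plays here.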