Datamarts might sound like the solution to all the headaches of segregated data sources finally organised into one accessible dataset; however, I will argue here that they might not be the solution you are looking for.
Power BI datamarts are a brand-new feature of the Power BI platform, announced earlier this year. They provide an alternative to the dataflows and promoted datasets that Power BI analysts have grown well accustomed to.
They aren't a replacement for existing methods of organising and presenting data to analysts, but rather a welcome addition that might appeal to some of them.
How are datamarts different from datasets?
One could argue both are fairly similar. A datamart is a combination of a dataflow, which manages all the transformations from sources such as your application databases; storage in a managed Azure SQL database, where the data is temporarily held; and lastly a dataset, which presents the data to Power BI for report building.
The layer that differentiates one from the other is most likely the temporary Azure SQL storage. Not only does it allow you to build and store datasets combining data of up to 100 GB, it also lets you query that data from tools other than Power BI using T-SQL.
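As a sketch of what that T-SQL access looks like, the following Python snippet queries a datamart's SQL endpoint through pyodbc. The server name, database name, and model.Sales table here are hypothetical placeholders; the real connection string is copied from the datamart's settings page in the Power BI service.

```python
def build_connection_string(server: str, database: str) -> str:
    """Assemble an ODBC connection string for the datamart's SQL endpoint,
    using Azure Active Directory interactive authentication."""
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server};Database={database};"
        "Authentication=ActiveDirectoryInteractive;"
    )

def top_customers(server: str, database: str, n: int = 10):
    """Run plain T-SQL against the datamart, just as against any SQL Server."""
    import pyodbc  # third-party driver, imported lazily so the module loads without it
    conn = pyodbc.connect(build_connection_string(server, database))
    sql = (
        "SELECT TOP (?) CustomerName, SUM(Amount) AS Total "
        "FROM model.Sales "  # hypothetical table exposed by the datamart
        "GROUP BY CustomerName ORDER BY Total DESC"
    )
    return conn.cursor().execute(sql, n).fetchall()

if __name__ == "__main__":
    # Hypothetical endpoint -- copy the real one from the datamart's settings.
    rows = top_customers("example.datamart.pbidedicated.windows.net", "SalesDatamart")
    for row in rows:
        print(row.CustomerName, row.Total)
```

The point is simply that any tool speaking ODBC and T-SQL can read the datamart, not just Power BI itself.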
Why might datamarts not be the solution to your problem?
Power BI datamarts frequently come up in conversation as an alternative to a data warehouse. Requiring little management and built into the Power Platform stack, they are an appealing concept for many organisations without an in-house data engineering team. However, a few limitations disqualify Power BI datamarts from taking over workloads reserved for data warehouses.
No support for historical data.
Power BI datamarts do not preserve historical data, meaning trend analysis, auditing, or data snapshotting will not be possible. What is usually accomplished by means of slowly changing dimensions in a data warehouse is not achievable in Power BI datamarts. Moreover, data is reloaded in full on each refresh, as it is after any schema change such as adding a new column or a table.
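To make the contrast concrete, here is a minimal sketch of the Type 2 slowly changing dimension pattern a warehouse would use to preserve history. It is illustrated with Python's built-in sqlite3 rather than a real warehouse engine, and the table and column names are invented for the example.

```python
import sqlite3

# In-memory database standing in for a warehouse; dim_customer keeps every
# historical version of a row instead of overwriting it (SCD Type 2).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        valid_from  TEXT,
        valid_to    TEXT,      -- NULL while the row is the current version
        is_current  INTEGER
    )
""")

def upsert_scd2(customer_id: int, city: str, as_of: str) -> None:
    """Close the current version of the row (if any), then insert a new one."""
    conn.execute(
        "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
        "WHERE customer_id = ? AND is_current = 1",
        (as_of, customer_id),
    )
    conn.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
        (customer_id, city, as_of),
    )

upsert_scd2(1, "London", "2022-01-01")  # initial load
upsert_scd2(1, "Berlin", "2022-06-01")  # customer moved: history is preserved

history = conn.execute(
    "SELECT city, valid_from, valid_to, is_current "
    "FROM dim_customer WHERE customer_id = 1 ORDER BY valid_from"
).fetchall()
print(history)
# -> [('London', '2022-01-01', '2022-06-01', 0), ('Berlin', '2022-06-01', None, 1)]
```

Because a datamart reloads the source data wholesale on every refresh, there is no place for the closed-out "London" row to live: only the latest state survives.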