Dataphin follows Ralph Kimball's dimensional modeling theory, allowing users to design conceptual models tailored to their specific business conditions. Based on these models, you can create dimension tables, fact tables, atomic metrics, business filters, derived metrics, and logical aggregate tables that correspond to business entities, which include business objects and business activities.
Prerequisites
The intelligent R&D module must be activated for the current tenant to utilize data standardization and modeling. For detailed instructions, see Tenant Management.
The data standardization and modeling feature is only supported by projects that are correctly bound to the data section. If there is an incorrect binding or if the data section is not bound at all, this feature will be unavailable. For guidance on how to bind projects to data sections, see Create a General Project.
The scope of features available for data standardization and modeling within a project is established by the default feature menu items chosen during the project's creation. For detailed information on data standardization, modeling, and the associated default feature menu items, see Create a General Project.
Note: Basic projects linked to Basic Mode data sections are compatible with data standardization and modeling. However, if Basic projects are linked to Dev-Prod Mode data sections, they can only use data processing and ad hoc query features.
Feature overview
You can create logical dimension tables, logical fact tables, atomic metrics, business filters, derived metrics, and logical aggregate tables based on your specific needs. The following table describes each modeling feature:
| Feature item | Description | References |
| --- | --- | --- |
| Logical dimension table | Logical dimension tables capture the detailed attribute information of business objects. You can design and create these tables to match each type of business object. | |
| Logical fact table | Logical fact tables capture the data generated by business activities. You can design and create logical fact tables that correspond to each type of business activity. | |
| Atomic metric | An atomic metric defines the statistical scope and calculation logic of a metric, for example, payment amount. Dataphin introduces the concept of development automation: a clear metric definition covers both the statistical scope and the calculation logic, which improves R&D efficiency and ensures consistent statistical results. | |
| Composite metric | A formula that derives a multivariate calculation from atomic metrics. For example, atomic metrics A and B can be combined into the composite metric C = A/B. | |
| Derived metric | Combines an atomic metric with a statistical period, business filter, and dimensions to define the business statistical scope and derive the values of business statistical metrics. Derived metric = Atomic metric + Business filter + Statistical period + Dimensions (statistic granularity). | |
| Business filter | Defines the business scope of statistics by filtering records according to business rules, similar to a SQL WHERE condition but excluding time intervals. Atomic metrics standardize the definition of calculation logic, whereas business filters standardize condition constraints. | |
| Logical aggregate table | Each derived metric belongs to exactly one logical aggregate table, which has a unique statistic granularity. Derived metrics that share the same statistic granularity and statistical period are grouped into the same logical aggregate table. The primary key of the table is the combination of dimensions from logical dimension tables that constitutes the statistic granularity; all fields that are not part of the primary key are metrics. | |
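The composition rules in the table above can be sketched in code. The following is an illustrative Python sketch, not Dataphin's actual API: all class and field names (`AtomicMetric`, `DerivedMetric`, `aggregate_table_key`, and the sample values) are hypothetical and exist only to show how a derived metric combines an atomic metric, a business filter, a statistical period, and dimensions, and how derived metrics with the same granularity and period group into one logical aggregate table.

```python
from dataclasses import dataclass

# All names below are hypothetical, for illustration only.
@dataclass(frozen=True)
class AtomicMetric:
    name: str            # e.g. "payment_amount"
    expression: str      # calculation logic, e.g. "SUM(pay_amt)"

@dataclass(frozen=True)
class DerivedMetric:
    atomic: AtomicMetric
    business_filter: str       # condition constraint, like a WHERE clause
    period: str                # statistical period, e.g. "last_7_days"
    dimensions: tuple          # statistic granularity, e.g. ("shop_id",)

def aggregate_table_key(m: DerivedMetric) -> tuple:
    """Derived metrics sharing the same statistic granularity and
    statistical period are grouped into the same logical aggregate table."""
    return (m.dimensions, m.period)

pay = AtomicMetric("payment_amount", "SUM(pay_amt)")
m1 = DerivedMetric(pay, "channel = 'app'", "last_7_days", ("shop_id",))
m2 = DerivedMetric(pay, "channel = 'web'", "last_7_days", ("shop_id",))

# Same granularity and period, so both land in one aggregate table,
# whose primary key is the dimension combination ("shop_id",).
assert aggregate_table_key(m1) == aggregate_table_key(m2)
```

The grouping key mirrors the rule in the table: the dimension combination forms the aggregate table's primary key, and every non-key field is a metric.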
Data standardization and modeling entry
Navigate to the Dataphin home page and click the R&D option on the top menu bar.
The Data Development page appears, with entry points for each data standardization and modeling module in the left-side navigation pane.