Intelligent Data Management (IDM) is the process and framework for establishing an organized set of patterns and methods that ensure data is classified, cataloged, mapped and exchanged in an auditable fashion. The bank is subject to regulation requiring that information be shared and used correctly, in order to reduce inconsistencies and prevent wrongdoing. In recent years, the mortgage and related loan crisis has required monitoring to ensure applications and their consumers exchange data as defined by enterprise standards. Detailed cataloging relies on instruments such as manifests (which track what data is sent); a manifest accompanies every payload (a batch of transmitted data) and adheres to an approved agreement (an arrangement between parties to transmit data with predefined types, layout, format and schedule), avoiding loosely defined queries. Approved Data Sources adopt these structures to ensure that transmitted data is authorized for consumption, to reduce the risk of intrusion (PUSH models increase security), and to retain proof of communication.
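As a concrete illustration, the sketch below shows what a manifest accompanying a payload might look like. This is a minimal sketch: the field names, the Agreement structure and the SHA-256 checksum scheme are assumptions made for illustration, not the enterprise format.

```python
# Illustrative sketch only: field names, the Agreement structure and the
# SHA-256 checksum scheme are assumptions, not the enterprise format.
import hashlib
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class Agreement:
    agreement_id: str      # approved arrangement between producer and consumer
    layout: str            # predefined record layout / schema name
    file_format: str       # e.g. delimited, fixed-width
    schedule: str          # agreed transmission schedule

def build_manifest(payload: bytes, record_count: int, agreement: Agreement) -> str:
    """Build the manifest that travels with every payload, so the receiver
    can verify what was sent and under which agreement it was sent."""
    manifest = {
        "agreement": asdict(agreement),
        "record_count": record_count,
        "payload_bytes": len(payload),
        "sha256": hashlib.sha256(payload).hexdigest(),
        "sent_at": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(manifest, indent=2)

if __name__ == "__main__":
    payload = b"ACCT|BALANCE\n1001|250.00\n1002|75.10\n"
    agreement = Agreement("AGR-0001", "ACCT_BALANCE_V1", "pipe-delimited", "daily 02:00 ET")
    print(build_manifest(payload, record_count=2, agreement=agreement))
```

Retaining the manifest alongside the payload gives both parties the proof of communication described above.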

This level of design also brings performance improvements and production-level support, allowing Level I teams to reproduce problems and reconstruct an outage constructively. Logging also provides the measurements needed to calculate costs and network utilization, which in turn inform production needs such as upgrades, migrations, transformations and forecasting.
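For example, a minimal sketch of the kind of measurement such logging enables, assuming a hypothetical transfer log of payload sizes and an assumed per-GB cost rate (neither reflects the bank's actual telemetry):

```python
# Minimal sketch: the log record layout and the per-GB cost rate are
# assumptions used for illustration only.
from collections import defaultdict

# (application, bytes_transferred, date) records as a stand-in for real transfer logs
TRANSFER_LOG = [
    ("LoanOrigination", 12_884_901_888, "2015-03-01"),
    ("LoanOrigination",  9_663_676_416, "2015-03-02"),
    ("RiskReporting",    4_294_967_296, "2015-03-01"),
]

COST_PER_GB = 0.08  # assumed transfer/storage rate in dollars

def utilization_by_app(log):
    """Aggregate bytes transferred per application and estimate cost,
    the kind of figure used to plan upgrades and migrations."""
    totals = defaultdict(int)
    for app, nbytes, _date in log:
        totals[app] += nbytes
    report = {}
    for app, nbytes in totals.items():
        gb = nbytes / 1024 ** 3
        report[app] = {"gb_transferred": round(gb, 2),
                       "estimated_cost": round(gb * COST_PER_GB, 2)}
    return report

if __name__ == "__main__":
    for app, stats in utilization_by_app(TRANSFER_LOG).items():
        print(app, stats)
```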

Benefits & Features of Intelligent Data Management
- Traceability
- Lineage
- System of Record
- System of Origin
- Managed Access
- Permissible Views
- Access Review Controls
- Authorized Data
- Data Delivery
- Standards
- Data Lake Integration
- Data Retention
- Provisioning Points
My involvement included the detailed task of on-boarding applications and databases to adopt IDM as the enterprise standard. This required detailed data mapping, mining, research and transformation to bring the underlying components into compliance with the prescribed standards. Forensics on existing ETL, DMO, SSIS and integration scripts were conducted to devise the best possible strategy and roadmap toward compliance. Applications and databases are often riddled with inconsistencies that create expensive overhead, such as redundant, stale and largely unused data. As the history and background of an implementation are lost over time, it is not unusual for database growth to go unmonitored and unregulated; storage and consumption peaks then exceed thresholds and may lead to outages. Migrations and upgrades are natural paths to pursue the following opportunities (a sketch of the kind of forensics involved follows this list):
- Application Performance Analytics
- Application Consolidation Opportunities
- Cross Portfolio Capabilities
- Host-Server Migration Opportunities
- Retirement & Decommissioning
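The sketch below illustrates the forensic step of flagging stale, rarely used data from catalog metadata. The catalog snapshot, the 18-month staleness threshold and the size cutoff are assumptions, not the actual on-boarding criteria.

```python
# Illustrative sketch: the catalog snapshot, the 18-month staleness threshold
# and the size cutoff are assumptions, not the actual on-boarding criteria.
from datetime import date

# (schema.table, size_gb, last_accessed) rows, standing in for a DB2 or
# Teradata catalog / query-log extract
CATALOG_SNAPSHOT = [
    ("LOANS.APP_STAGE_2009", 310.0, date(2011, 6, 30)),
    ("LOANS.APP_CURRENT",     42.5, date(2015, 3, 15)),
    ("RISK.DAILY_SNAPSHOT",  870.0, date(2013, 1, 4)),
]

STALE_AFTER_DAYS = 18 * 30   # assumed staleness threshold (~18 months)
MIN_SIZE_GB = 50.0           # only flag tables large enough to matter

def stale_candidates(snapshot, as_of=date(2015, 4, 1)):
    """Flag large tables with no recent access as purge/archive candidates,
    the kind of forensic output that feeds a compliance roadmap."""
    flagged = []
    for table, size_gb, last_accessed in snapshot:
        idle_days = (as_of - last_accessed).days
        if idle_days > STALE_AFTER_DAYS and size_gb >= MIN_SIZE_GB:
            flagged.append((table, size_gb, idle_days))
    return sorted(flagged, key=lambda row: row[1], reverse=True)

if __name__ == "__main__":
    for table, size_gb, idle_days in stale_candidates(CATALOG_SNAPSHOT):
        print(f"{table}: {size_gb} GB, idle {idle_days} days")
```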
Consequently, database engineering provides the means to improve the platform and relieve these technological pressures. Removing old data that exceeds retention requirements often means purging it or archiving it to less expensive media. Dispositioning exercises use forensics to mine usage patterns and establish the use, importance and consumers of data. This helps define the needs and requirements for implementing those uses in target-state databases. It also highlights the need to prioritize where data should reside, especially when it can be mapped into the following tiers (a tiering sketch follows this list):
- High-Availability (DB2 or Oracle transactional databases)
- Low-Cost (Teradata or hosted SQL Azure, inexpensive to operate)
- High Density (archive-grade Hadoop that is near-online)
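A minimal tiering sketch is shown below, assuming the placement rules are driven by mined access frequency and transactional needs; the thresholds and rule shape are illustrative assumptions, not the actual dispositioning criteria.

```python
# Illustrative sketch: the tier rules (access frequency and transactional
# needs) are assumptions used to show the shape of a dispositioning exercise.
def assign_tier(reads_per_day: int, needs_transactions: bool) -> str:
    """Map a table to a storage tier based on mined usage patterns."""
    if needs_transactions or reads_per_day > 1_000:
        return "High-Availability (DB2/Oracle)"
    if reads_per_day > 10:
        return "Low-Cost (Teradata / SQL Azure)"
    return "High Density (archive-grade Hadoop, near-online)"

if __name__ == "__main__":
    samples = [
        ("LOANS.APP_CURRENT",    5_000, True),
        ("RISK.DAILY_SNAPSHOT",     40, False),
        ("LOANS.APP_STAGE_2009",     0, False),
    ]
    for table, reads, txn in samples:
        print(f"{table} -> {assign_tier(reads, txn)}")
```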
TBD
- SQL Improvements
- Data Consolidation
- SQL Migration
- ODBC Deployment & Configuration
- SELMA Package Design
TBD.
- System Overview
- Architecture Design
- Project Status
- Project Artifacts & Repository
- On-boarding Documentation
- On-boarding Guide
- DB2 Table Catalog
- Teradata Catalog
- Migration Mapping
TBD.