JOHN COSMAS
Program Manager

Project Highlights:

Term: 18 months
Max Budget: 4 million
Resources: 4
Methods: RAD, Agile, DAIC, AWE
Data/Platform Migration
Card Analytics
Core Applications
 
Card Technologies
Data Migration
Data Dispositioning
Data Warehousing
Application Onboarding
BI Reporting
Solutions Architecture
Data Management
Appfluent Profiling
Data Mining
Configuration Management

 

Featured Technologies:
 

Microsoft Windows Server
Microsoft SQL Server 2014
Microsoft Windows 7
Microsoft Access & Excel
Microsoft Office Developer
Oracle Virtual Machine
IBM DB2 9.7 AIX
IBM DB2 11.1 LUW
ODBC DB2 9.7, 10.5, 11.1
IBM Data Studio
IBM Server Manager
TOAD Data Point 4.0
SAP PowerDesigner
Teradata & Teradata Studio
SSIS, SSAS, SSRS, Cognos
SAS, SAS Grid & SAS EG
MS Application Virtualization
PowerShell & SQL Scripts
Appfluent
LINQ & LINQPad
Netezza
Hadoop & Impala
Permit-to-Build/Operate
JIRA, Rally & VersionOne
AppHQ, MDH & DNT
PPRT & MS Project
MS Visio Architect


Data Quality & Control

The DQC group was formed within GWIM (Global Wealth & Investment Management) to provide support for a host of financial systems, including the various payment, deposit & customer hubs that service card technologies for the bank.



 

The BACARDI platform is an analytical database that needed to be upgraded to DB2 11.1 BLU on RLE to replace the older version, which was running up against frequent growth limits and aging hardware.  I was commissioned to help migrate the platform into the new DB2 ecosystem, which included an Operational Data Platform (ODP), an Analytical Data Platform (ADP) and OneHadoop.
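
A minimal sketch of the kind of conversion involved, with hypothetical schema, table and column names (ANALYTICS.CARD_TXN is illustrative only): a row-organized table is re-created as column-organized for BLU and then repopulated.

  -- Re-create a row-organized table as column-organized (DB2 BLU),
  -- then repopulate and swap names.  All names are hypothetical.
  CREATE TABLE ANALYTICS.CARD_TXN_NEW (
      TXN_ID    BIGINT        NOT NULL,
      ACCT_ID   BIGINT        NOT NULL,
      TXN_DATE  DATE          NOT NULL,
      TXN_AMT   DECIMAL(15,2)
  ) ORGANIZE BY COLUMN;

  INSERT INTO ANALYTICS.CARD_TXN_NEW
      SELECT TXN_ID, ACCT_ID, TXN_DATE, TXN_AMT
      FROM ANALYTICS.CARD_TXN;

  -- Swap names once the copy has been verified.
  RENAME TABLE ANALYTICS.CARD_TXN TO CARD_TXN_LEGACY;
  RENAME TABLE ANALYTICS.CARD_TXN_NEW TO CARD_TXN;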

 

Simplification created the need to re-organize the current system into a target state that remediated redundancies, removed stale data stores and retired a series of libraries no longer fit for use.  Dispositioning exercises enabled the classification of the data, libraries and components that would be migrated to the intended platforms.
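
One way to capture the output of such a dispositioning exercise is a simple inventory that records each object's classification and target platform; the structure below is an illustrative sketch, not the bank's actual model.

  -- Hypothetical dispositioning inventory.
  CREATE TABLE DISPO.OBJECT_INVENTORY (
      OBJECT_SCHEMA   VARCHAR(128) NOT NULL,
      OBJECT_NAME     VARCHAR(128) NOT NULL,
      OBJECT_TYPE     VARCHAR(32)  NOT NULL,   -- TABLE, VIEW, LIBRARY, SCRIPT
      DISPOSITION     VARCHAR(16)  NOT NULL,   -- MIGRATE, ARCHIVE, RETIRE
      TARGET_PLATFORM VARCHAR(32),             -- ODP, ADP, OneHadoop
      LAST_USED       DATE,
      OWNER_GROUP     VARCHAR(64),
      PRIMARY KEY (OBJECT_SCHEMA, OBJECT_NAME)
  );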

 

My portfolio included the following groups:

 

Data Quality & Control
Fraud
Mortgage
Commercial Lending
Wholesale Banking
Large Commercial
Small Business Banking
Lloyds Banking Europe

 

 

Areas of work included:

 

Enterprise System Architecture
Functional Specifications Design
Non-Functional Requirements
Application Process Modelling
UAT/Integration Test Design
Build to Operate Program (Permit-to-Build & Permit-to-Operate)
Enterprise Control Functions
Enterprise Data Management
Global Records Management standards
Global Business Recovery Control
Permissible Use
Authorized Data Source
System of Record
Data Dispositioning
Data Aggregation & Reconciliation
Data Warehousing & Mining
Vendor Evaluations & Assessments
Database Migration
Application Retirement & Migration
Application/User On-boarding
Application Resource Provisioning
SAS Enterprise Guide
Test Planning
Data Lakes

 

The list of applications/solutions supported includes:

 

Unsecured Term Loans
Vehicular Lending
Product Offering Engine
Bank Card Analytics Decisioning
Report.NET
Fraud Controls & Recovery
Card Data Hub
Customer Data Hub
Account Data Hub
Transactions Data Hub
Data Discovery & Visualization
Merchant Administration
Rewards Engine
Payment Hub
Anti-Money Laundering
Associate Monitoring
Global Data Payment Processing
Deposits Hub
C3PM
Credit Bureau Processing

 

 

BACARDI Legacy 9.7 to 11.1 migration.

Teradata Operational Data Platform

Teradata Analytics Data Platform

OneHadoop

 

 

TBD.

 Data Requirements
 User Accounts
 Permissible Views (see the sketch after this list)
 Data Tier Access
 User Group Controls
 Data Retention
 Application Connectivity
 ODBC/SAS Configuration
 LINQ Improvements
 Schema Mapping
 Day ZERO Loading
 SLA/Schedules
 Downstream Data Provisioning
 Data Provisioning & Delivery
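
Several of these on-boarding items (user accounts, permissible views, data tier access and user group controls) ultimately reduce to granting a consumer a constrained view of the data.  A minimal sketch in DB2 terms, with hypothetical view, role and group names:

  -- Hypothetical permissible view with role-based access.
  CREATE ROLE BACARDI_CARD_READ;

  -- Expose only the columns and retention window the consumer may see.
  CREATE VIEW ADP.V_CARD_TXN_PERMISSIBLE AS
      SELECT TXN_ID, ACCT_ID, TXN_DATE, TXN_AMT
      FROM ADP.CARD_TXN
      WHERE TXN_DATE >= CURRENT DATE - 7 YEARS;

  GRANT SELECT ON ADP.V_CARD_TXN_PERMISSIBLE TO ROLE BACARDI_CARD_READ;

  -- On-board a user group by granting it the role.
  GRANT ROLE BACARDI_CARD_READ TO GROUP CARD_ANALYSTS;

Routing access through a role rather than individual grants keeps access review controls manageable as users are on-boarded and off-boarded.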

 

 

Intelligent Data Management involves the process and framework for establishing a highly organized set of patterns and methods to ensure data is classified, cataloged, mapped and exchanged in a fully auditable fashion.  The bank is subject to regulation requiring that information be shared and used correctly, in an effort to alleviate inconsistencies and avert wrongdoing.  In recent years, the mortgage and related loan crisis has required monitoring to ensure applications and their consumers exchange data as defined by enterprise standards.  Furthermore, detailed cataloging relies on implements such as the manifest (which tracks what data is sent): one accompanies every payload (a batch of transmitted data) and adheres to an approved agreement (an arrangement between parties to transmit data with a predefined type, layout, format and schedule), replacing loosely defined queries.  Approved Data Sources adopt these structures to ensure transmitted data is authorized for consumption, to guard against intrusions (a PUSH model increases security), and to retain proof of communication.
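
As a rough sketch of how a manifest and its governing agreement can be recorded (all table and column names below are hypothetical, not the bank's model), every payload row references the approved agreement it was transmitted under:

  -- Hypothetical agreement and manifest structures.
  CREATE TABLE IDM.DATA_AGREEMENT (
      AGREEMENT_ID   INTEGER     NOT NULL PRIMARY KEY,
      PROVIDER_APP   VARCHAR(64) NOT NULL,
      CONSUMER_APP   VARCHAR(64) NOT NULL,
      LAYOUT_VERSION VARCHAR(16) NOT NULL,   -- predefined type/layout/format
      SCHEDULE       VARCHAR(32) NOT NULL    -- e.g. DAILY 02:00
  );

  CREATE TABLE IDM.PAYLOAD_MANIFEST (
      MANIFEST_ID    BIGINT      NOT NULL PRIMARY KEY,
      AGREEMENT_ID   INTEGER     NOT NULL REFERENCES IDM.DATA_AGREEMENT,
      SENT_TS        TIMESTAMP   NOT NULL,
      ROW_COUNT      BIGINT      NOT NULL,
      BYTE_COUNT     BIGINT      NOT NULL,
      CHECKSUM       VARCHAR(64) NOT NULL    -- retained proof of what was sent
  );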

 

This level of design also introduces performance improvements and production-level support that allow Level I teams to reproduce problems and reconstruct an outage methodically.  Logging also yields measurements of cost and network utilization that help size production needs such as upgrades, migrations, transformations and forecasting.
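
Building on the hypothetical manifest above, a simple rollup yields the kind of volume figures those cost and capacity discussions rely on:

  -- Monthly transfer volume per consuming application (illustrative).
  SELECT a.CONSUMER_APP,
         YEAR(m.SENT_TS)  AS YR,
         MONTH(m.SENT_TS) AS MO,
         SUM(m.ROW_COUNT)                 AS TOTAL_ROWS,
         SUM(m.BYTE_COUNT) / 1073741824.0 AS TOTAL_GB
  FROM IDM.PAYLOAD_MANIFEST m
  JOIN IDM.DATA_AGREEMENT  a ON a.AGREEMENT_ID = m.AGREEMENT_ID
  GROUP BY a.CONSUMER_APP, YEAR(m.SENT_TS), MONTH(m.SENT_TS)
  ORDER BY TOTAL_GB DESC;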

 

Benefits & Features of Intelligent Data Management

 

  • Traceability
  • Lineage
  • System of Record
  • System of Origin
  • Managed Access
  • Permissible Views
  • Access Review Controls
  • Authorized Data
  • Data Delivery
  • Standards
  • Data Lake Integration
  • Data Retention
  • Provisioning Points

 

My involvement included the detailed task of on-boarding applications and databases to adopt IDM as the enterprise standard.  This required detailed data mapping, mining, research and transformation to ensure underlying components became compliant with the prescribed standards.  Forensics on existing ETL, DMO, SSIS and integration scripts were conducted to help devise the best possible strategy and roadmap towards compliance.

 

 

Applications and databases are often riddled with inconsistencies that create expensive overhead, such as redundant, stale and rarely used data.  As the history and background of implementations get lost over time, it is not unusual for database growth to go unmonitored and unregulated.  Consequently, storage and consumption peaks exceed thresholds and may lead to outages.  Migrations and upgrades are natural paths to pursue the following opportunities (a sizing sketch follows the list):

 

 Application Performance Analytics
 Application Consolidation Opportunities
 Cross Portfolio Capabilities
 Host-Server Migration Opportunities
 Retirement & Decommissioning
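
As one illustration of the sizing work (generic DB2 catalog views rather than the Appfluent profiling referenced elsewhere on this page), physical size by schema can be tracked with a query such as:

  -- Largest schemas by physical size; ADMINTABINFO reports sizes in KB.
  SELECT TABSCHEMA,
         SUM(DATA_OBJECT_P_SIZE + INDEX_OBJECT_P_SIZE
             + LONG_OBJECT_P_SIZE + LOB_OBJECT_P_SIZE) / 1048576 AS SIZE_GB
  FROM SYSIBMADM.ADMINTABINFO
  GROUP BY TABSCHEMA
  ORDER BY SIZE_GB DESC
  FETCH FIRST 20 ROWS ONLY;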

 

Consequently, database engineering provides the means to improve the platform and relieve these technological pressures.  Removing old data that exceeds retention requirements often means purging it or archiving it to less expensive media.  Dispositioning exercises apply forensics to mine usage patterns and establish the use, importance and consumers of data.  This helps establish the needs and requirements to better implement those uses in target-state databases.  It also helps prioritize where data should reside, especially when it can be mapped into the following tiers (a classification sketch follows the list):

 

 High-Availability (DB2 or Oracle transactional databases)
 Low-Cost (Teradata or hosted SQL Azure, inexpensive to operate)
 High Density (archive-grade Hadoop that is near-online)
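
A minimal sketch of the usage mining behind such a mapping, using the DB2 catalog's last-used dates; the tier thresholds here are illustrative assumptions, not programme standards:

  -- Bucket tables into candidate tiers by last use.
  SELECT t.TABSCHEMA, t.TABNAME, t.LASTUSED,
         CASE
             WHEN t.LASTUSED >= CURRENT DATE - 90 DAYS THEN 'HIGH-AVAILABILITY'
             WHEN t.LASTUSED >= CURRENT DATE - 2 YEARS THEN 'LOW-COST'
             ELSE 'HIGH-DENSITY / ARCHIVE'
         END AS CANDIDATE_TIER
  FROM SYSCAT.TABLES t
  WHERE t.TYPE = 'T'
  ORDER BY t.LASTUSED;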

 

 

TBD

 

 SQL Improvements
 Data Consolidation
 SQL Migration
 ODBC Deployment & Configuration
 SELMA Package Design

 

 

 

TBD.

 System Overview
 Architecture Design
 Project Status
 Project Artifacts & Repository
 On-boarding Documentation
 On-boarding Guide
 DB2 Table Catalog
 Teradata Catalog
 Migration Mapping

TBD.