Project Overview:
T&T is in search of a talented Data Architect and Data Modeler to join our team in developing a scalable data model for the National Fisheries Logbook. This pivotal project will span multiple Regional Offices, creating a unified data and validation model to support various headless applications for vessel logging.
The candidate will work closely with Regional Offices to design and implement a centralized data store, develop scalable APIs for seamless integration with React applications and PWAs, drive the standardization and efficiency of vessel log tracking by consolidating data into a national database, and enhance reporting capabilities across client Regional Offices.
Objective: Transition from isolated vessel log tracking systems to a centralized national database to improve standardization, reporting, and efficiency across our client’s Regional Offices. This is one of the first pilot programs of its kind to take a holistic platform and data model approach. The position’s responsibilities and growth may expand to incorporate other application types moving forward.
Primary Responsibilities:
Exhibit strong leadership in modern database design, architectural principles, and technological modernization.
Assist in scaling a common validation model that can be utilized across the applications.
Enhance existing database and storage design for optimal efficiency.
Reengineer database architecture and system design, introducing innovative technologies.
Implement API architecture for seamless data communication within applications.
Create detailed documentation and diagrams.
Collaborate effectively with the application team, infrastructure team, leadership, and other contractors.
Develop holistic database solutions rather than temporary fixes.
Prototype new technology solutions in the cloud and other modern databases, tools, and designs.
Participate in defining target state applications and data technology architecture roadmaps.
Contribute to T&T proposal activities and other duties as assigned by T&T managers/supervisors.
Required Skills & Qualifications:
BA or BS degree
5+ years of experience building, architecting, designing, and modernizing legacy database systems and building new target state architecture
Organizational experience in an agile operating model
Demonstrated experience developing proofs of concept with new technologies.
The ability to architect and model mission-critical RDBMS, BI, OLTP, and OLAP solutions leveraging multiple DBMS technologies is required, along with the ability to design new schemas and to run, tune, and maintain databases in a production environment
Program systems, scripts, and queries against SQL Server, Oracle, NoSQL databases (e.g., MongoDB), and Snowflake, in both on-premises and cloud environments, using services such as Amazon RDS, AWS Glue, PostgreSQL, MySQL, DynamoDB, and Elasticsearch, as well as Microsoft Azure cloud data services (or equivalent) utilizing Functions, Data Factory, and Managed Instances
Extensive experience with Data Analysis, Data Profiling, Data Flows, and Data Modeling is required
Experience in a Service Oriented Architecture
Desired Skills & Qualifications:
Python
Experience programming and requesting information from RESTful endpoints
Experience architecting big data solutions using Data Lake, Data Modeling, and Data Products
Experience with MongoDB, Tableau, Apache Cassandra, Snowflake
Experience with cloud-native DB design
Expertise in architecting and developing database solutions
Experience with NoSQL and GraphQL technologies
Experience programming in an object-oriented language
Design ERDs and produce high quality engineering design documents
Willingness and ability to apply complex business logic to programs, scripts, and queries that interface with large databases
Provide rational analysis of data and means to improve structure and integrity.
Benefits: Competitive benefits package including health, dental, vision, and life insurance coverage, a 401(k) plan, training programs, accrued Paid Time Off (PTO), and paid holidays.
Equal Opportunity Employer/Veterans/Disabled