
Big Data/Artificial Intelligence Data Tech Lead #252182



New York, NY


Information Technology

Job Description:

Area Overview:

The Technology Group (TG) is responsible for the strategic planning and provisioning of technology services to the largest bank in the System. These include applications development; data architecture; network, communications, and data center infrastructure and operations; project management; technology vendor management; and overall information technology and information security. The group also provides national information security, incident response, national remote access, and enterprise search services for the System.

Solution Delivery manages a portfolio of more than 150 diverse applications, systems, platforms, and integrated commercial off-the-shelf (COTS) products, ranging from mission-critical for the System to tactical for specific business line units. The function implements and delivers according to the Client’s needs using appropriate tools and technologies, including COTS utilities and applications, building in-house, working with vendors, or combining all of these methods to ensure optimum performance and customer satisfaction.

A manager/team lead in Data Services is responsible for technically leading a subset of a competency center’s responsibilities focused on Common Services.

Competency centers are accountable to the Delivery teams for providing supplemental resources to respond to demand changes, as well as specialized skills, tools, and services to support all Delivery teams, other TSG functions, the Bank, etc.

Common Services focus on providing multiple services, including:

• Common Software / Reuse – Develop and maintain common components, micro-services, and frameworks across multiple technologies (e.g., Java, data)

• Tooling – Support tools across multiple disciplines (e.g., developer, data, testing, PM)

• Foundational Services – SharePoint, UI, data services (e.g., ETL, data masking)

• Engineering, Platforms and COTS Services – Standing up solutions for the first time and providing ongoing expertise (e.g., Appian, Sitecore, Hadoop big data solutions, natural language processing), with responsibility for resource management (including Java), delivery management, standards, quality checks, training and cross-training, and continuous improvement.

Job Description and Responsibilities:

Manage a team dedicated to the design, development, and delivery of Hadoop ecosystem-based big data solutions and deep learning projects using open-source libraries and vendor products. These solutions will eventually provide services across multiple lines of business within the System.

Will collaborate closely with Java, Python, and data architects on design work and develop secure, scalable application solutions. This position includes managing Hadoop-based systems and artificial intelligence (deep learning) work, providing training and guidance to developers, designing and developing architecture, and working with the end users of the system. Will take a deep dive into the architecture of existing systems to enhance the quality of deliverables and minimize support effort. Other responsibilities include:

·         Ongoing responsibility for managing technical debt across the inventory of Data Services products.

·         Responsible for evaluating new technologies, providing design expertise, supporting open-source products, and overseeing database designs and configurations.

·         Ensure proper and complete project billing and minimize non-billable activities to keep rates and costs as streamlined as possible.

·         Responsible for talent management to ensure performance assessment, development plans, appraisals, succession planning, growth, and remediation as needed.

·         Responsible for delivery of highly available, high-performance, mission-critical applications.


Job Requirements:

Required Skills:

·         10+ years of hands-on experience developing in Java and architecting applications, including 3+ years leading a team.

·         3 to 4 years of hands-on experience installing, deploying, and administering Hadoop on large-scale cluster implementations.

·         3 to 4 years working with the Hadoop ecosystem and frameworks such as Spark, HBase, Hive, Pig, Sqoop, Flume, and MapReduce.

·         2 to 3 years working with artificial intelligence (deep learning) open-source libraries such as spaCy, Word2Vec, GloVe, etc.

·         Experience with data warehouses, including data modeling, SQL, ETL tools, data visualization, and reporting.

·         General knowledge of open source frameworks.

·         Experience working with Unix/Linux systems.

·         Working knowledge of JavaScript and JSON.

·         Experience with web services (e.g., REST APIs).

·         Excellent interpersonal and presentation skills.

·         Solid understanding of security architecture concepts.

·         Experience with software development, implementing complex technology solutions, and using management tools such as version control and application build and deployment utilities.

·         Experience managing a team of 8–10 (e.g., designers, modelers, developers, contractors), including defining and implementing the strategy and technology roadmap for future data technologies.

Preferred Skills:

·         Python experience

·         Experience managing vendors and integrating solutions within enterprise applications

·         Experience with open-source graph databases such as Neo4j

·         Knowledge of web security best practices

Education/Certifications: Bachelor's or Master's Degree in Computer Science or equivalent
