Senior Python, PySpark, Scala Developer - Location Open
Location: Gilbert, Arizona
Internal Number: 15631807
Are you an experienced, passionate pioneer in technology - a solutions builder, a roll-up-your-sleeves technologist who wants a collaborative, think-tank environment where you can share new ideas with your colleagues every day - without the extensive demands of travel? If so, consider an opportunity with our US Delivery Center - we are breaking the mold of a typical delivery center.
Our US Delivery Centers have been growing since 2014 with significant, continued growth on the horizon. Interested? Read more about our opportunity below ...
Function as an integrator between business needs and technology solutions, helping to create technology solutions that meet clients' requirements. Be responsible for developing and testing solutions that align with clients' systems strategy, requirements, and design, as well as supporting system implementation. Manage the data pipeline process from acquisition through ingestion, storage, and provisioning of data to the point of impact by modernizing and enabling new capabilities. Facilitate data integration on traditional and Hadoop environments by assessing clients' enterprise IT environments. Guide clients to a future-state IT environment that supports their long-term business goals. Enhance business drivers through enterprise-scale applications that enable the visualization, consumption, and monetization of both structured and unstructured data.
From our centers, we work with Deloitte consultants to design, develop and build solutions to help clients reimagine, reshape and rewire the competitive fabric of entire industries. Our centers house a multitude of specialists, ranging from systems designers, architects and integrators, to creative digital experts, to cyber risk and human capital professionals. All work together on diverse projects from advanced preconfigured solutions and methodologies, to brand-building and campaign management.
We are a unique blend of skills and experiences, yet we underline the value of each individual, providing customized career paths, fostering innovation and knowledge development with a focus on quality. The US Delivery Center supports a collaborative team culture where we work and live close to home with limited travel.
• Bachelor of Science in Computer Science, Engineering, or MIS, or equivalent experience
• 6+ years of Hadoop (Cloudera distribution) experience
• 6+ years of experience in Spark with Scala or Python programming
• 6+ years of experience with Hive tuning, bucketing, partitioning, UDFs, and UDAFs
• 6+ years of experience with a NoSQL database such as HBase, MongoDB, or Cassandra
• 6+ years of experience and knowledge working with Kafka, Spark Streaming, Sqoop, Oozie, Airflow, Control-M, Presto, NoSQL, and SQL
• 6+ years of experience in the financial/insurance domain
• 6+ years of strong technical skills, including an understanding of software development principles
• 6+ years of hands-on programming experience
â¢ Must live a commutable distance to one of the following cities: Atlanta, GA; Austin, TX; Boston, MA; Charlotte, NC; Chicago, IL; Cincinnati, OH; Cleveland, OH; Dallas, TX; Detroit, MI; Gilbert, AZ; Houston, TX; Indianapolis, IN; Kansas City, MO; Lake Mary, FL; Los Angeles, CA; Mechanicsburg, PA; Miami, FL; McLean, VA; Minneapolis, MN; Nashville, TN; Orange County, CA; Philadelphia, PA; Phoenix, AZ; Pittsburgh, PA; Rosslyn, VA; Sacramento, CA; St. Louis, MO; San Diego, CA; Seattle, WA; Tallahassee, FL; Tampa, FL; or be willing to relocate to one of the following USDC locations: Gilbert, AZ; Lake Mary, FL; Mechanicsburg, PA.
Limited immigration sponsorship may be available.
• Ability to travel up to 15% (while 15% travel is a requirement of the role, due to COVID-19, non-essential travel has been suspended until further notice)
• 6+ years of experience working with the Big Data ecosystem, including tools such as MapReduce, Sqoop, HBase, Hive, and Impala
• Expert-level proficiency with Jenkins and GitHub
• Proficiency in one or more modern programming languages, such as Python or Scala
• Experience with data lake and data hub implementation
• Knowledge of AWS or Azure platforms
• Knowledge of techniques for designing Hadoop-based file layouts optimized to meet business needs
• Ability to translate business requirements into logical and physical file structure designs
• Ability to build and test solutions in an agile delivery manner
• Ability to articulate the reasons behind the design choices being made
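To make the file-layout and partitioning skills listed above concrete, here is a minimal sketch of the Hive-style partitioned directory layout (`column=value` subdirectories) that tools like Spark and Hive write. This is an illustrative example only, using just the Python standard library; the function name, column names, and sample data are hypothetical, not part of the role's actual codebase:

```python
import csv
import tempfile
from collections import defaultdict
from pathlib import Path

def write_partitioned(rows, base_dir, partition_key):
    """Write rows into a Hive-style layout: base_dir/<key>=<value>/part-0.csv.

    Partition pruning works because each distinct value of the partition
    column gets its own directory, so a query engine can skip whole
    directories that cannot match a filter on that column.
    """
    groups = defaultdict(list)
    for row in rows:
        groups[row[partition_key]].append(row)

    written = []
    for value, group in groups.items():
        # One directory per partition value, e.g. event_date=2021-03-01
        part_dir = Path(base_dir) / f"{partition_key}={value}"
        part_dir.mkdir(parents=True, exist_ok=True)
        path = part_dir / "part-0.csv"
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=list(group[0].keys()))
            writer.writeheader()
            writer.writerows(group)
        written.append(path)
    return sorted(written)

# Hypothetical sample data partitioned by event date.
rows = [
    {"event_date": "2021-03-01", "customer_id": "7", "amount": "120.50"},
    {"event_date": "2021-03-01", "customer_id": "9", "amount": "80.00"},
    {"event_date": "2021-03-02", "customer_id": "7", "amount": "35.25"},
]
paths = write_partitioned(rows, tempfile.mkdtemp(), "event_date")
print([p.parent.name for p in paths])
```

In production this layout is produced by `DataFrame.write.partitionBy(...)` in Spark rather than by hand; the point is only to show how a partition column becomes physical directory structure.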
At Deloitte, our professional development plan focuses on helping people at every level of their career to identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to help sharpen skills in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their career.

At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture where our people excel and lead healthy, happy lives.

Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities.

We want job seekers exploring opportunities at Deloitte to feel prepared and confident. To help you with your interview, we suggest that you do your research: know some background about the organization and the business area you're applying to.