...experience managing the Big Data application stack including HDFS, YARN, Spark, Hive and HBase - A deeper understanding of all the configurations... ...applications on YARN - Experience with one or more automation tools such as Ansible, Terraform, etc. - Experience working with CI/CD...
...Job Description
Mission As a Spark Backline Engineer you will help our customers to be successful with the Databricks Data Intelligence... ...guides and runbooks.
Contribute to automation and tooling programs to make daily troubleshooting efficient.
Work with the...
...years of experience and interest in Big Data technologies (Hadoop / Spark / Relational DBs) - 3+ years of experience working on projects... ...constructing relational and dimensional data models using any ETL/ELT tools (e.g., Talend, Informatica, Alteryx) - Possess ETL experience...
...Engineer or DataOps role.- 3+ years of experience working with SQL and Spark.- 2+ years of experience with Microsoft Azure Data Lake, Azure... ...having multiple sources.- Experienced in the use of standard ETL tools and techniques (e.g. SSIS or any equivalent ETL tools).- Familiar...
...1. Strong knowledge of Spark development with Scala 2. Strong knowledge of Spark Streaming (DStreams) and Structured Streaming
2. Good knowledge... ...and Apache Airflow
9. Willingness to learn new technologies and tools, if required
Key Skills: Scala, Spark, Kafka, MongoDB, Elasticsearch...
...We are looking for an exceptional Spark/Scala engineer with 5+ years of experience who will be responsible for:
Experience: 5+ years
Location... ...pipelines using Airflow
Experience with SQL, CI/CD, the Gradle build tool, and Docker
Familiar with big data - REST web server integration,...
...soon you can join:
- Should have 4-8 years of experience in Java and Spark SQL, with a Unix/Linux background.
- Should have Good... ...Domain. (Optional)
- Understand the Agile process, code check-in tools, deployment tools, and scheduler tools. (Optional)
Should have worked...
...maintain scalable data pipelines for batch processing using Apache Spark in Big Data projects.- Utilize Scala programming language to... ...the data ecosystem as needed.- Explore and implement visualization tools to provide insights into data for stakeholders.Requirements : - Bachelor...
...Must have Skills: Apache Spark
Good to Have Skills: Data Warehouse ETL Testing
Key Responsibilities:
A: The resource will write and review complex SQL statements
B: The resource will work on ETL, preferably on OWB (Oracle Warehouse Builder)
C: The resource will work on database related...
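The SQL-writing and review responsibility above could look like the following minimal, self-contained sketch (table name, columns, and data are hypothetical; SQLite stands in here purely for illustration, not OWB's Oracle backend):

```python
import sqlite3

# Hypothetical table and data, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('north', 10), ('north', 20), ('south', 5);
""")

# A GROUP BY / HAVING query of the kind an ETL reviewer would inspect.
rows = conn.execute("""
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
    HAVING SUM(amount) > 10
    ORDER BY total DESC
""").fetchall()
# rows -> [('north', 30.0)]
```

In a review, the HAVING clause (filtering on the aggregate, not the raw rows) is exactly the kind of detail that distinguishes a correct statement from a subtly wrong one.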
...Design, develop, and maintain data processing pipelines using Apache Spark and Scala.
Optimize Spark jobs for performance, scalability, and reliability.
Work closely with data engineers and data scientists to implement data-driven solutions.
Develop and maintain...
...Design, develop, and maintain robust data pipelines using Apache Spark, Python, and other relevant technologies.- Implement data ingestion... ...and SQL.- Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes) is a plus.- Excellent problem-solving...
...solving hard problems
•Experience with Big Data technologies (Hadoop, Java, Spark, Kafka, Hive)
•Good knowledge of Unix/Linux and SQL
•Experience with data visualization and business intelligence tools such as Tableau is highly desired
•Familiar with software...
...related field- Strong knowledge of machine learning, statistical analysis, and data visualization tools and technologies- Experience with big data processing frameworks, such as Spark or Hadoop- Strong analytical and problem-solving skills, with the ability to analyze complex...
...our different stakeholders.
Responsibilities
Develop, improve, and maintain highly complex Government Pricing calculations with Spark.
Work with our internal and external clients to implement, test and report client-specific methodologies using state-of-the-art and...
...of Architecture and Design fundamentals
Knowledge of Testing tools
Knowledge of agile methodologies
Understanding of Project life... ...Professional Requirements:
Primary skills: Technology > Big Data > Data Processing > Spark; Technology > Functional Programming > Scala
...Kochi, Hyderabad, Trivandrum. Job Description: - Proficient in SQL, Spark, Scala, and AWS, with a strong command over these technologies. -... ...effectively optimizing performance. - Adept at leveraging these tools to enhance efficiency and productivity within projects. - Crucial...
...warehousing project.
Must have excellent knowledge of Apache Spark and Python programming experience
Hands-on experience with Azure... ...and operationalizing the code, knowledge of scheduling tools like Airflow, Control-M etc. is preferred
Working experience on...
...coding across one or more platforms and languages (e.g. Java, Python/Spark/SQL) as appropriate
Hands-on expertise with application design... ...from supplied specifications using agreed standards and tools, to achieve a well engineered result Proficient and Hands-on Data...
...Spark & Scala Developer | Experience: 7-13 years
Shift: Malaysian
Full-time contract opportunity
Fully remote
Please share your resume at [HIDDEN TEXT]
Responsibilities:
Create Scala/Spark jobs for data transformation and aggregation
Produce unit tests for Spark...
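One common way to meet the "unit tests for Spark" responsibility above is to keep transformation logic in plain functions that run without a cluster. A sketch in Python for self-containment (function and field names are hypothetical; a Scala/Spark job would express the same aggregation as `df.groupBy("key").agg(sum("amount"))`):

```python
def aggregate_amounts(records):
    """Sum 'amount' per 'key' -- the same aggregation a Spark job
    would run as groupBy("key").agg(sum("amount")), kept as a plain
    function so it is trivially unit-testable."""
    totals = {}
    for rec in records:
        totals[rec["key"]] = totals.get(rec["key"], 0) + rec["amount"]
    return totals

# A unit test needs no Spark session at all:
sample = [{"key": "a", "amount": 1},
          {"key": "a", "amount": 2},
          {"key": "b", "amount": 5}]
assert aggregate_amounts(sample) == {"a": 3, "b": 5}
```

The design choice is separation of concerns: Spark I/O (reading sources, writing sinks) lives at the job's edges, while the pure logic in the middle gets fast, deterministic tests.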
...collaborate with stewards / data custodians / end users etc. - Demonstrated experience of building efficient data pipelines using tools or technologies like Spark, and cloud-native tools like EMR, Azure Data Factory, Informatica, Ab Initio, etc. - Understand the need for both batch and...
...develop, and deploy scalable Big Data applications using Apache Spark.
Collaborate with data scientists and business analysts to understand... ....
Stay updated with the latest Big Data technologies and tools to drive innovation and improve performance.
Collaborate...
...insights with different stakeholders using reporting and visualization tools. - Work on large datasets to meet functional and business... ...Airflow, Redshift. - Strong SQL and Scala skills using the Spark framework. - Experience with enterprise data warehousing...
...Senior Engineer (Spark/Scala)
Position Overview
Job Title: Senior Java Developer
Location: Pune, India
Role Description... ...on and be able to work independently requiring minimal technical/tool guidance. Should be able to technically guide and mentor junior developers...
...: Anywhere in India. Education: BE, B.Tech, any tech graduate. Must-Have Technical Skills: 3+ years of Spark or Scala; 2+ years of Hadoop/Big Data using tools like Hive, Spark, PySpark, Scala, and RDBMS/SQL. Strongly Preferred: GCP, including GCS (Google Cloud Storage...
...designing, developing, and maintaining large-scale data processing pipelines using Python and Spark/PySpark.- Your expertise in distributed computing frameworks and DevOps tools will be instrumental in building efficient and scalable data solutions.Responsibilities :- Design...
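The distributed-computing expertise mentioned above ultimately rests on the map/shuffle/reduce model; a single-process Python stand-in (data and names are illustrative) for what Spark runs across executors:

```python
from collections import Counter
from functools import reduce

def map_phase(partition):
    # Each executor would count words in its own partition of the data.
    return Counter(partition.split())

def reduce_phase(left, right):
    # Partial counts are merged, much like a Spark reduceByKey.
    return left + right

# Two 'partitions' standing in for data spread across the cluster.
partitions = ["spark scala spark", "scala kafka"]
word_counts = reduce(reduce_phase, map(map_phase, partitions))
# word_counts["spark"] == 2, word_counts["scala"] == 2
```

Spark's value is running the map phase in parallel on many machines and handling the shuffle between phases; the logical model, however, is no more than this fold over partial results.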
...IPO - Headquarters: Mumbai - Nature of Offering: Product & Services - Founding Year: 1991 - No. of Employees: 5001-10000 - Role: Spark Developer - Experience: 6+ years - Location: Bangalore, Hyderabad - Primary Skills: Spark, Apache Spark - Key Responsibilities: - Design, develop...
Title: Azure Databricks/Spark Engineer
Location: Greater Noida, Uttar Pradesh (Day 1 onsite)
Duration: 6+ Months (Possibility of Extension... ...in Azure Data Factory, Airflow or any other orchestration tool
Knowledge of Hadoop, HDFS, Hive and Big Data Concepts
Good to...
...crew and material for the execution of contract awarded to its clients.
Role Description
This is a full-time on-site role for a Tool Pusher. The Tool Pusher will be responsible for supervising and leading rig crew to ensure drilling and completion operations are...
...seeking a highly skilled and motivated Data Lead with expertise in Spark, Scala, Kafka, Big Data, and Batch Processing. As a Data Lead,... ...sizeable volumes (GBs) of data - Working experience with visualization tools - Working experience with Scala programming - Working experience on...
...will be responsible for Extract, Transform, Load (ETL) using ETL tools, data integration, data modeling, and analytical skills. - The role... ..., data analysis, data mining, etc. - Knowledge of Hadoop, Hive, Spark, and Impala will be an added advantage - Exposure to AWS Glue, Redshift...