Responsibilities:
- Ensure scoping is correct
- Planning, architecture definition, and implementation
- Reporting progress
Skills:
- Python
- PySpark
- Apache Spark
- Databricks
- Big Data experience is required (Spark, Dataproc, etc.)
- Cloud certification is required (AWS/Azure/GCP)
Requirements:
- At least 10 years of experience with data
- B2 or native English level
- Actively working
How to apply:
To streamline our application process and ensure a fair assessment for all candidates, we ask that all applicants apply through LinkedIn using the Apply button.
About Worky:
Worky is an on-demand talent cloud platform that seamlessly connects companies with remote, vetted professionals globally. Positioned to redefine the landscape of the on-demand talent industry, Worky stands as the foremost provider of vetted talent in data, software engineering, and product development. Our esteemed clientele relies on us to access top-tier talent for remote positions, and Worky's rapidly expanding global talent network empowers both established organizations and startups to meet their diverse hiring needs at any scale.

Central to our operations is Worky's sourcing process, which revolutionizes talent vetting and matching, delivering engaged and qualified professionals with unprecedented speed.

The key advantages of partnering with Worky include our overarching mission, a vast global network, a commitment to guaranteed quality, cutting-edge technology integration, and a hyper-personalized, proactive approach to talent acquisition.

To explore how Worky can revolutionize your approach to talent acquisition, please visit us at https://www.worky.tech