Data Engineer – Pakistan

Become part of PureVPN’s commitment to revolutionizing security and freedom in a digital age


About us:

At PureSquare, we strive to deliver more than a VPN to our users. We are a complete cybersecurity solution that offers unrestricted access to the internet without the worry of attacks from hackers and phishers. PureSquare started back in May 2006 as a passion project. Since then, it has grown into a colossal brand that helps its users freely access information and content globally. From its inception, PureSquare's vision has been about quality, freedom, and choice of connectivity, underpinning connections in most countries without any kind of limitations.

With PureSquare you get instant access to 6,500+ of the fastest VPN servers across 96+ locations worldwide. Every country we choose as a VPN location has its own specialty and further empowers our users. Now, with a renewed focus, we are powering safety and consent at every level and layer of digital security. We are going beyond a simple VPN product, working towards a lasting and transformative impact for a safer and more equal world than the one we know, so that you can be you online. #BeYouOnline #PrivateByConsent

The Role:

Have you ever thought of joining a hyper-growth startup that's redefining its industry? Look no further! We are one of the fastest-growing companies in our sector, using innovative technology and solving the hardest problems to be the one-stop shop for all your online security needs.

As a Data Engineer, you will be part of our growing team, helping us rapidly evolve our Data Platform to meet the increasingly data-led needs of the business as we expand into new products and new markets. With a strong, demonstrated understanding of dimensional modeling and similar data warehousing techniques, and an embrace of technologies like real-time processing and universal event logging, you will bring a rich blend of technical strength, engineering excellence, and business acumen, along with the ability to operate in a very fast-moving environment and deliver value quickly and in an agile fashion.

For the right person, this role represents a huge opportunity to shape the data capabilities of a fast-growing, digital-native business, and catapult your career to the next level.

What you will be doing:

  • Design it: build and manage a highly robust and scalable Data Warehouse/ETL/event-driven infrastructure and scalable data pipelines; build and launch new data models and data marts that provide intuitive analytics to the organization.
  • Run it: ensure we build fault-tolerant systems and processes where data integrity is king, underpinned by automated data quality monitoring and alerting.
  • Evolve it: constantly look for things to improve, whether fixing recurring problems, delivering small but helpful features, or optimizing for performance and scalability.
  • Secure it: make sure privacy and data security are first-class citizens.
  • Document everything: maintain top-class, up-to-date documentation for our entire Data Platform stack.
  • Partner: work effectively with both business stakeholders and product engineering to deliver high-value BI assets.
  • Collaborate: work with key stakeholders and partners to understand and shape requirements and to drive the roadmap for our Data Platform.

What you need for this role:

  • 5+ years of experience in a data engineering or similar role
  • Strong database query skills using Postgres
  • Strong programming experience (Python preferred)
  • Experience with messaging systems (such as RabbitMQ or Kafka)
  • Hands-on experience with BI tools such as Power BI and Tableau
  • Experience working with cloud platforms (such as AWS or Google Cloud)
  • Experience working with public data APIs (such as the Google Analytics, Slack, Facebook, and Twitter APIs)
  • Deep understanding of core OLAP and data warehousing concepts and practices
  • Experience with data privacy issues and management
  • Experience with active management of data quality (including monitoring and alerting)
  • Good understanding of the SDLC and CI/CD pipelines
  • Familiarity with search/classification algorithms is a plus

Application Form