Big Data Architect

**Big Data Architect – £560 per day – Leeds – 4 month initial contract**

**Public sector end client – An IR35 determination has not yet been made by the end client, and for recruitment purposes the role is being treated as “in scope” – this may change**

As a Big Data Architect, you will be able to demonstrate expertise in leading enterprise data architecture and design for large, complex businesses. Your credible track record of designing for microservices, external data ingestion, real-time data analytics, domain-driven master data models, event-driven enterprise data sharing, authoritative data sources and an external API exchange will be utilised in a continuously improving, challenging and satisfying environment.

You’ll thrive creating value from large, diverse data volumes, e.g. by using Tableau, Qlik, C3, Hadoop / Spark / SQL / MongoDB Atlas / Python to create modelling features from underlying transactional data. You will be willing to roll up your sleeves periodically to co-create solutions hands-on through modern data science and general programming languages. You will be comfortable working in a cross-functional team (including UX, analysts, statisticians, engineers, product owners, security risk, etc.).

Your significant experience in design and implementation, along with your excellent technical and analytical skills, will be regularly put to the test.

Essential Knowledge

  • Expert level data modelling, conceptual through to physical, relational, object, analytical and NoSQL.
  • Expert in domain-driven design and microservices.
  • Expert in designing analytical database models from underlying transactional data.
  • Hands-on experience with MongoDB, Hadoop, Tableau, Qlik and Spark.
  • Scripting/coding experience (e.g. Bash, Python, Perl).
  • Excellent experience with the Hadoop ecosystem (such as HDFS, YARN and/or Hive).
  • Strong experience with streaming and stream-processing frameworks (such as Spark, Storm, Flink, Kafka and/or Kinesis).
  • Good knowledge of at least one of the following programming languages: Python, Scala, Go, Kotlin, Java.
  • Experience with NoSQL databases (such as HBase, Cassandra and/or MongoDB).
  • Experience with public cloud-based technologies (such as Kubernetes, AWS, GCP, Azure and/or OpenStack).
  • Expert level in the creation and maintenance of enterprise data artefacts.
  • Highly proficient in at least one query language for each of the following: relational, analytical and NoSQL.
  • Proven innovator with proficient software engineering skills to prototype relational, analytical and NoSQL solutions.
  • Highly motivated with experience of selecting and working with data and data tools.
  • Experience of canonical modelling.
  • Capable of engaging senior stakeholders.
  • Capable of developing and maintaining strong working relationships within the organisation and with third-party suppliers.
  • Experience of developing data strategies, policies and standards.
  • Experience of working within the public sector, or a comparable organisation, within the last 3 years.
  • TOGAF Practitioner.

Damia Group Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept our Data Protection Policy which can be found at

Damia Group is acting as an Employment Business in relation to this vacancy.

Job Reference: TL/BIDATA

Salary: £500 - £560 per day

Salary per: Day

Job Duration: 6 months

Job Start Date: ASAP


Head Office

Guildford Business Park
Building 2
18 Guildford Park Road
Guildford, GU2 8XG
+44 1483 243 555

Edinburgh Office

93 George Street
+44 7718 258 398

Lisbon Office

Av. Almirante Reis, 54, 6º piso
1150-019, Lisboa
(+351) 211 601 380
