Vacancy: Big Data Architect – Amsterdam
Bachelor's degree (HBO) or higher
O Assess, define, implement and/or support data-intensive / big data architectures for our clients.
O Be part of the leading group of Devoteam's architects: people with a deep technical background who are willing to go the extra mile to pioneer future data-intensive architectures.
O Propose innovative technology solutions and take the lead in implementing and evaluating them. Assist in refining and enhancing big data architectures and frameworks.
O Work closely with our sales executives to sell big data consulting work. Drive technical and architecture workshops.
O Present subject-matter-expert visions on current and future data-intensive architecture topics clearly and confidently.
O Supervise architecture design, infrastructure design, build and deployment activities on multiple platforms.
Master's degree in computer science, software engineering, electronic engineering or a similar field.
3+ years of experience in assessing, architecting, designing or implementing IT architectures and applications related to big data technologies such as Hadoop, MapReduce, HDFS, Hive, R and Cassandra.
3+ years of experience in designing architectures for business intelligence and data analytics platforms.
Strong drive to continuously develop yourself in the field of big data.
Ability to put together an architecture roadmap for large analytics initiatives, detailing the development, execution and operations architecture.
Experience working directly with business users to gather and document technical requirements.
Deep experience with modeling techniques, including use-case modeling, scenario modeling, prototyping and performance modeling.
Experience with IT delivery life-cycle and management methodologies and tools.
Technology experience (at least 3 of the following skills):
Hands-on experience with MPP systems (e.g. Hadoop, Teradata, Greenplum, Netezza, Azure SQL Data Warehouse).
Hands-on software engineering experience (e.g. Java, Scala, Perl/Python/PHP).
Hands-on experience and data modelling skills with data warehouses (dimensional modeling, snowflake schema, CDW, data vault, data lake, etc.).
Hands-on experience with data integration / data management / ETL tools (e.g. Informatica, Ab Initio, DataStage, Talend, Pentaho, NiFi, Kafka Connect, SSIS) and with BI and reporting software (e.g. MicroStrategy, Qlik, Cognos, Pentaho).
Knowledge of NoSQL platforms (e.g. key-value stores, document stores, graph databases, RDF triple stores).
Practical experience in designing and implementing batch-oriented and real-time architectures (e.g. using Kafka, Azure IoT Hub/Event Hubs, Spark, Storm, Azure Stream Analytics).
Expertise and knowledge of cloud computing and of cost and performance optimization in the cloud (e.g. AWS, GCP, Azure): IaaS, PaaS, SaaS, FaaS.
Practical experience in the architecture design of data platforms and data-intensive applications: cloud, hybrid, on-premise, portable/vendor-agnostic, etc.
What we offer
A unique culture where people speak frankly, have great mutual respect and share a genuine interest in technology. Team spirit and an international environment. A great salary package with bonuses and the option of a commuting allowance or lease car. Career-specific training and certifications. Salary to be discussed in a personal meeting.