|Job Type:|Full Time|
This role offers you the unique opportunity to work at the interface of research and product, developing novel privacy-preserving machine learning techniques and applying them in practice.
We are a small team in Office 365. We collaborate closely with researchers and scenario owners to investigate how state-of-the-art research in areas such as differential privacy, model attacks, and exposure metrics can be effectively applied to real-world data and scenarios across Office 365.
Office 365 is the largest collaboration service in the world, with hundreds of millions of consumer and enterprise mailboxes, documents, and conversations; it represents the world’s largest platform for human collaboration in personal, business, and educational use. We are a massively distributed cloud service with O(exabyte) data handled by O(300K) servers in O(300) data centers around the globe.
We are looking for an applied scientist with experience in one or more areas of privacy-preserving machine learning who can help create a framework to quantify and mitigate privacy risk, in order to guarantee our customers’ privacy in the next generation of machine learning systems. Challenges include incorporating approaches such as differential privacy and multi-party computation within Office 365 systems, designing machine learning systems that operate on encrypted data and/or in decentralized or federated environments, and advancing the efficient frontier of privacy and utility in our product offerings without compromising users’ trust.
Location will be Redmond, US or Cambridge, UK.
- Apply state-of-the-art privacy-preserving machine learning research to production scenarios with real-world data
- Interact with researchers, applied scientists and scenario owners
- Investigate realistic threat models when shipping ML scenarios
- Develop metrics to quantify risk and exposure
- Develop mitigation techniques and tools with which to choose the privacy/utility trade-off
- There will be the opportunity to publish and contribute to scientific conferences
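To give a flavor of the privacy/utility trade-off mentioned above, here is a minimal, illustrative sketch of the Laplace mechanism for a counting query; the function names and parameters are our own for this example and are not part of any Office 365 system:

```python
import math
import random

def laplace_sample(scale):
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, predicate, epsilon):
    """Epsilon-differentially-private count.

    A counting query has sensitivity 1 (adding or removing one record
    changes the count by at most 1), so adding Laplace noise with scale
    1/epsilon satisfies epsilon-differential privacy.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_sample(1.0 / epsilon)
```

A smaller epsilon means stronger privacy but a noisier answer; choosing epsilon per scenario is exactly the kind of trade-off this role would quantify.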
- PhD/MS with relevant work experience
- Strong knowledge of privacy-preserving machine learning techniques such as differential privacy, model attacks, and exposure metrics
- Solid experience with natural language processing and a strong understanding of language representations for NLU/NLG
- Broad and solid understanding of common statistical and machine learning techniques, both classical machine learning and deep learning
- Strong analytical, applied research, and communication skills
- Experience working on big data pipelines and cloud solutions
- Ability to work independently and in a team, take initiative and lead engagements as required
Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, color, family or medical care leave, gender identity or expression, genetic information, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran status, race, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable laws, regulations and ordinances. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. If you need assistance and/or a reasonable accommodation due to a disability during the application or the recruiting process, please send a request via the Accommodation request form.
Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work.