Deep Learning in Security—An Empirical Example in User and Entity Behavior Analytics (UEBA)


“Apache Spark is a powerful, scalable real-time data analytics engine that is fast becoming the de facto hub for data science and big data. However, in parallel, GPU clusters are fast becoming the default way to quickly develop and train deep learning models. As data science teams and data-savvy companies mature, they will need to invest in both platforms if they intend to leverage both big data and artificial intelligence for competitive advantage.

This session will cover:
– How to leverage Spark and TensorFlow for hyperparameter tuning and for deploying trained models
– DeepLearning4J, CaffeOnSpark, IBM’s SystemML and Intel’s BigDL
– Sidecar GPU cluster architecture and Spark-GPU data reading patterns
– The pros, cons and performance characteristics of various approaches

You’ll leave the session better informed about the available architectures for Spark and deep learning, and Spark with and without GPUs for deep learning. You’ll also learn about the pros and cons of deep learning software frameworks for various use cases, and discover a practical, applied methodology and technical examples for tackling big data deep learning.

Session hashtag: #SFds14”
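
As a rough illustration of the abstract's first bullet (using Spark and TensorFlow together for hyperparameter tuning), one common pattern is to have the Spark driver fan a parameter grid out to executors and let each task train and score an independent model. The grid, the small two-layer Keras model, and the MNIST dataset below are illustrative assumptions for this sketch, not code from the session.

# A minimal sketch of Spark-driven hyperparameter search for a TensorFlow/Keras
# model. The parameter grid, model architecture, and dataset are illustrative
# assumptions, not material from the talk.
from pyspark.sql import SparkSession

def train_and_score(params):
    # Each Spark task trains an independent model on whichever worker it lands on.
    # TensorFlow is imported inside the function so the import happens on the worker.
    import tensorflow as tf
    lr, units = params

    # Illustrative dataset; each task loads its own copy.
    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
    x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
    x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(units, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=lr),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=2, batch_size=128, verbose=0)
    _, acc = model.evaluate(x_test, y_test, verbose=0)
    return (lr, units, float(acc))

if __name__ == "__main__":
    spark = SparkSession.builder.appName("tf-hyperparam-grid").getOrCreate()

    # Small illustrative grid; each combination becomes one Spark task.
    grid = [(lr, units) for lr in (1e-2, 1e-3) for units in (64, 128)]
    results = (spark.sparkContext
                    .parallelize(grid, len(grid))
                    .map(train_and_score)
                    .collect())

    for lr, units, acc in sorted(results, key=lambda r: -r[2]):
        print(f"lr={lr}, units={units}, test accuracy={acc:.4f}")

    spark.stop()

In this pattern Spark provides the coarse-grained parallelism across trials, while each trial's training stays local to a single executor (optionally one backed by a sidecar GPU node, as in the architectures the session compares), and the driver only collects the final scores.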

Comments

Manoj Kulshreshth says:

Very good application.
Can we get the PPTs as a download?

liaochei says:

Is there any information about the Score Layer in this talk? Thanks!

raj shekhar says:

Thanks for the great session.
What are the advantages of converting the network data into images (i.e., protocol time-series graphs) and feeding those images to a CNN? It adds the overhead of converting the data into images, so could we not work directly on the network data instead?
