
Junior Software Developer @EJADA
Jul 2013 - current
I am currently part of the .NET Development Team.
I work with the ASP.NET framework and other Microsoft tools (e.g., SSIS), mainly writing server-side C# code for large-scale industry projects.
Jul 2012 - Sep 2012
The Eureka Research Program was started in March 2012 by the Alexandria Student Branch of IEEE (IEEE AlexSB).
The Student Branch matched student applicants with research scientists around the world working in shared fields of interest within computer science or electrical engineering. My team of two was matched with our mentor, Dr. Taghrid Samak, a research scientist in the Advanced Computing for Science Department at Lawrence Berkeley National Laboratory (LBNL).
We worked mainly on large-scale statistical analysis of Internet2 router logs using Apache Pig over Hadoop, including computation of cross-path correlation coefficients and detection of statistical outliers.
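For illustration only, the sketch below reproduces the flavor of that analysis in plain Python on a toy sample; the actual jobs were written in Pig Latin and run on Hadoop, and the path names and data here are hypothetical rather than drawn from the real Internet2 log schema.

```python
# Illustrative sketch, not the actual Pig Latin jobs: pairwise correlation
# between per-path traffic series and simple z-score outlier flagging.
# Path names and sample data are hypothetical stand-ins for Internet2 logs.
import numpy as np

def cross_path_correlation(series_by_path):
    """Pairwise Pearson correlation between per-path time series."""
    paths = sorted(series_by_path)
    matrix = np.vstack([series_by_path[p] for p in paths])
    return paths, np.corrcoef(matrix)

def flag_outliers(series, z_threshold=3.0):
    """Indices of samples more than z_threshold standard deviations from the mean."""
    z = (series - series.mean()) / series.std()
    return np.flatnonzero(np.abs(z) > z_threshold)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    base = rng.normal(100, 10, size=500)
    logs = {
        "path_a": base + rng.normal(0, 2, size=500),  # correlated with base
        "path_b": rng.normal(100, 10, size=500),      # independent path
    }
    logs["path_b"][42] = 400                          # injected outlier
    paths, corr = cross_path_correlation(logs)
    print(paths, corr.round(2))
    print("outliers on path_b:", flag_outliers(logs["path_b"]))
```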
During the project, my duties included:
-System administration tasks, including working around the somewhat tricky setup of Hadoop and Pig on a Windows machine and a Linux VM.
-Working within a team of two (supervised by our mentor) to test several techniques for detecting cross-path correlation in Internet2 logs. Methods were first tested on a subset of the large data set on the team's laptops and later on a 50-node cluster.
-Briefly experimenting with Hive and Cascading over Hadoop.
Aug 2011 - Sep 2011
I worked on STAMPEDE (Synthesized Tools for Archiving, Monitoring, Performance and Enhanced DEbugging) in a team of three with a remote mentor.
My work was on STAMPEDE's data visualization module and included extracting workflow and job statistics using Python scripts that interfaced with an SQLite database and a RabbitMQ message bus.
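As a rough sketch of that kind of glue script (not the actual STAMPEDE code), the snippet below reads per-job statistics from an SQLite database and publishes them to a RabbitMQ queue via pika; the table, column, and queue names are illustrative assumptions rather than the real STAMPEDE schema.

```python
# Minimal sketch, assuming a hypothetical job_stats table and queue name;
# the real schema and routing come from the STAMPEDE tooling.
import json
import sqlite3

import pika  # RabbitMQ client library

def publish_job_stats(db_path, queue="stampede.stats"):
    # Pull per-job runtime statistics out of the workflow database.
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT job_id, duration, exit_code FROM job_stats"
    ).fetchall()
    conn.close()

    # Push each record onto a RabbitMQ queue for a downstream consumer.
    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = connection.channel()
    channel.queue_declare(queue=queue, durable=True)
    for job_id, duration, exit_code in rows:
        channel.basic_publish(
            exchange="",
            routing_key=queue,
            body=json.dumps({"job_id": job_id,
                             "duration": duration,
                             "exit_code": exit_code}),
        )
    connection.close()

if __name__ == "__main__":
    publish_job_stats("workflow.db")
```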
2005 - 2008
2005 - 2013