Software design and programming - object-oriented and functional programming paradigms with Scala.
Object-oriented concepts, designing object-oriented software using a proven methodology and tools.
Internet and web technologies, network protocols, presenting and manipulating information on the World Wide Web.
Cloud Computing - Big Data, Hadoop and MapReduce algorithms.
Mobile and Android programming.
A Play2 project written in Scala. The project uses MySQL as a database, Slick3 for persistence and Deadbolt for security; authorization is hand-written. The back end uses JSON for data composition and the WS library for connections to the data server. The front end has both a mobile and a web version: the web page uses Bootstrap and jQuery for data access, while the mobile version is built with the Cordova cross-platform tool (JS, HTML) and uses jQuery and jQuery Mobile.
This application finds real-time bus arrival times in London. Through a simple Bootstrap interface, by entering the bus route number and selecting the desired bus stop, the application returns bus arrival time predictions.
The development of this application had a purely academic purpose. The main aim was to build a reactive web application using the Play framework with Scala. The Transport for London (TfL) open data feed was used as the data source. The base application was meant to connect to the TfL open data stream via the Iteratee and Enumerator classes, with data periodically requested over WebSockets.
Unfortunately, TfL refused access to the real-time data stream, so the server-side application instead updates its data with the WS library and prepares it as JSON.
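The WS and JSON plumbing is Play-specific, but the server-side step described above can be sketched in plain Scala: taking TfL-style predictions and returning, per route and stop, the soonest arrivals. The Prediction type and its field names are my assumptions for illustration, not the real TfL schema.

```scala
// Hypothetical shape of one arrival prediction (not the real TfL schema).
final case class Prediction(route: String, stopName: String, secondsToArrival: Int)

object Arrivals {
  // Filter predictions to the requested route and stop, soonest first.
  def nextArrivals(preds: Seq[Prediction], route: String, stop: String, k: Int = 3): Seq[Prediction] =
    preds
      .filter(p => p.route == route && p.stopName == stop)
      .sortBy(_.secondsToArrival)
      .take(k)
}
```

In the real application the input sequence would come from a WS call to the TfL endpoint and the result would be serialised back to JSON for the Bootstrap front end.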
The client-side web application uses asynchronous requests with jQuery and is styled with Bootstrap. It has a login feature to personalize searches. Security is established with the Deadbolt security plugin, data persistence for MySQL uses Slick3, and the whole data access interface (login, registration, access control, etc.) is hand-written.
The client-side mobile application is built with the PhoneGap cross-platform tool and styled with jQuery Mobile.
The overall aim was to undertake a Play2 project using Scala as the programming language, taking advantage of the framework's building blocks and Scala's functional strengths.
Other smaller pieces of code written for Play2 include Slick2 and Slick3 samples, course materials from the Coursera course "Principles of Reactive Programming", and improvised tutorial samples from the books "Reactive Web Applications" by Manuel Bernhardt, "Play for Scala" by Peter Hilton, Erik Bakker and Francisco Canedo, and "Scala in Depth" by Joshua Suereth, along with other code in the "Twitter-streams", "AppWorkspace", "PlayAppWorkspace" and "scala" repositories.
Implement a fully functional HTTP API for a restaurant stock management system.
Each item the restaurant stocks has a name, a price and a category. We also keep track of each item's quantity as it gets used in the restaurant, so we know in advance when we're running out of one of them.
Implement the following endpoints:
1) list of all available items of a given category
2) list of all the items running out of stock (assume 10 items as a threshold)
3) add a new item
4) update the item quantity (example: when the restaurant uses an item, we want to reduce its stock availability counter)
5) remove an item (meaning that the item will not be bought again, so we need to take it out of our inventory)
The database can be mocked in memory as preferred, so there's no need to deal with drivers, connections and so on.
The application should be production grade (apart from the mocked database, of course) and it should be possible to run it from the command line. It's fine to assume that we already have build tools installed (java, maven, sbt…).
It would be much appreciated if you use Scala but there are no language/tool restrictions, use whatever you find more appropriate.
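The domain model and in-memory store behind these endpoints can be sketched as follows. The Item and StockRepo names are hypothetical choices for this sketch, not necessarily those used in the repository; the real service wires such functions to Akka HTTP routes.

```scala
import scala.collection.concurrent.TrieMap

// Each stocked item has a name, price, category and current quantity.
final case class Item(name: String, price: BigDecimal, category: String, quantity: Int)

// In-memory "database", keyed by item name; threshold matches the task's
// running-out-of-stock assumption of 10 items.
class StockRepo(threshold: Int = 10) {
  private val items = TrieMap.empty[String, Item]

  def add(item: Item): Unit = items.put(item.name, item)

  // The item will not be bought again, so drop it from the inventory.
  def remove(name: String): Unit = items.remove(name)

  // delta is negative when the restaurant uses up stock.
  def updateQuantity(name: String, delta: Int): Option[Item] =
    items.get(name).map { i =>
      val updated = i.copy(quantity = i.quantity + delta)
      items.put(name, updated)
      updated
    }

  def byCategory(category: String): List[Item] =
    items.values.filter(_.category == category).toList

  def runningOut: List[Item] =
    items.values.filter(_.quantity < threshold).toList
}
```

Each of the five endpoints maps onto exactly one of these methods, which keeps the HTTP layer a thin translation of query parameters into repository calls.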
$ git clone https://github.com/rcerka01/AKKA-HTTP-restaurant-management-api
$ cd AKKA-HTTP-restaurant-management-api
$ sbt run
$ sbt test
$ curl -X GET http://localhost:8080/all-items
$ curl -X GET http://localhost:8080/running-out-of-stock/food
$ curl -X POST localhost:8080/add-item?name=newitem\&price=1.3\&category=food\&quantity=10
$ curl -X PUT localhost:8080/update-item-quantity?name=cheese\&quantity=1007
$ curl -X DELETE localhost:8080/remove-item?name=spoon
You're given the task of writing a simple console version of a drawing program. The functionality of the program is quite limited but this might change in the future. The program should work as follows:
1. create a new canvas.
2. start drawing on the canvas by issuing various commands.
The program should support the following commands:
C w h Should create a new canvas of width w and height h.
L x1 y1 x2 y2 Should create a new line from (x1,y1) to (x2,y2). Currently only horizontal or vertical lines are supported. Horizontal and vertical lines will be drawn using the 'x' character.
R x1 y1 x2 y2 Should create a new rectangle, whose upper left corner is (x1,y1) and lower right corner is (x2,y2). Horizontal and vertical lines will be drawn using the 'x' character.
B x y c Should fill the entire area connected to (x,y) with colour 'c'. The behaviour of this is the same as that of the "bucket fill" tool in paint programs.
Q Should quit the program.
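A minimal sketch, in the same spirit, of the C and L commands (the Canvas type, its immutable layout and the border characters are my own choices here, not the repository's actual code):

```scala
// Immutable canvas of w x h cells; coordinates in commands are 1-based.
final case class Canvas(w: Int, h: Int, cells: Vector[Vector[Char]]) {
  // Conventional rendering with a '-' top/bottom and '|' side border.
  def render: String = {
    val border = "-" * (w + 2)
    (border +: cells.map(r => "|" + r.mkString + "|") :+ border).mkString("\n")
  }
}

object Canvas {
  // C w h: create a new blank canvas.
  def create(w: Int, h: Int): Canvas =
    Canvas(w, h, Vector.fill(h, w)(' '))

  // L x1 y1 x2 y2: only horizontal or vertical lines, drawn with 'x'.
  def line(c: Canvas, x1: Int, y1: Int, x2: Int, y2: Int): Canvas = {
    require(x1 == x2 || y1 == y2, "only horizontal or vertical lines")
    val points =
      if (y1 == y2) (math.min(x1, x2) to math.max(x1, x2)).map(x => (x, y1))
      else (math.min(y1, y2) to math.max(y1, y2)).map(y => (x1, y))
    points.foldLeft(c) { case (cv, (x, y)) =>
      cv.copy(cells = cv.cells.updated(y - 1, cv.cells(y - 1).updated(x - 1, 'x')))
    }
  }
}
```

The R command reduces to four calls to `line`, and B is a standard flood fill over the same cell grid.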
$ git clone https://github.com/rcerka01/ConsoleVersionOfDrawingProgram.git
$ cd ConsoleVersionOfDrawingProgram
$ sbt run
$ sbt test
A more thorough description is available on GitHub.
Pure Scala code samples. The first example is a Bitbucket repository with small code samples covering collections (list, map, filter, zip, foldLeft, foldRight, etc.), currying, pattern matching, trait composition, recursion and tail recursion, type classes, implicit values, Option, streams, lazy evaluation, futures/promises and more. The second example is a project developed with Akka actors.
Here is a collection of small code samples I made to refresh my memory on different aspects of programming in Scala. Some parts are simplified to address only one particular purpose; others are more sophisticated solutions, including exercises from Martin Odersky's course "Functional Programming Principles in Scala". It does not represent the best or most elegant code ever written, but it does show the level of involvement and understanding in each of the listed categories.
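To give a flavour of the categories listed above, here are a few snippets of the same kind - foldLeft, currying, pattern matching and tail recursion. These exact examples are written for this page, not copied from the repository.

```scala
import scala.annotation.tailrec

object Samples {
  // foldLeft: sum of squares of a list.
  def sumOfSquares(xs: List[Int]): Int = xs.foldLeft(0)((acc, x) => acc + x * x)

  // Currying: a partially applied adder.
  def add(a: Int)(b: Int): Int = a + b
  val addTen: Int => Int = add(10)

  // Pattern matching on Option with a guard.
  def describe(o: Option[Int]): String = o match {
    case Some(n) if n < 0 => s"negative $n"
    case Some(n)          => s"value $n"
    case None             => "empty"
  }

  // Tail recursion: factorial with an accumulator.
  @tailrec
  def factorial(n: Int, acc: BigInt = 1): BigInt =
    if (n <= 1) acc else factorial(n - 1, acc * n)
}
```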
The task was to rewrite an existing code base using the Akka library and introduce parallel computation via actor objects. The objective of this work was to take existing code and apply the actor programming paradigm to it.
A 3D scene is represented as a set of objects. A ray tracer works by casting rays from a virtual camera onto the scene, computing intersections with objects in the scene. The rays are cast through an invisible plane (the view window) in the scene that represents the generated image. The view window can be thought of as a grid, where each square represents a pixel of the generated image. When a ray hits an object, the ray is reflected and refracted, producing secondary rays which may intersect other objects. All of these rays contribute to the final colour computed for the pixel. For instance, a ray through a given pixel might intersect with a reflective red object and the reflected ray may in turn intersect a blue object. The resulting colour of the pixel will be purple.
The input data is stored in the input.dat file in the root folder; the output is an output.png file. The actor system is started from the Trace class. The CoordinatorActor class is separated from the Coordinator class just to store parameters. After receiving an initial message, the coordinator starts the Trace.traceImage function, which loops over the pixels and sends a CoordinatorColorMsg with parameters to the coordinator. When the "waiting" parameter (initially picture width × height) reaches zero, the system terminates.
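Akka itself isn't pulled in here, so this plain-Scala sketch only models the coordinator protocol described above: one CoordinatorColorMsg per pixel and a "waiting" counter that starts at width × height and counts down to zero, at which point the actor system would terminate. The message and class names follow the description; the field names and everything else are assumptions.

```scala
// One traced pixel reported back to the coordinator (field names assumed).
final case class CoordinatorColorMsg(x: Int, y: Int, rgb: Int)

final class Coordinator(width: Int, height: Int) {
  private var waiting = width * height          // pixels still outstanding
  private val pixels = Array.ofDim[Int](height, width)

  // In the real project this logic lives in the actor's receive block.
  def receive(msg: CoordinatorColorMsg): Unit = {
    pixels(msg.y)(msg.x) = msg.rgb
    waiting -= 1
    if (waiting == 0) println("all pixels traced - shutting down")
  }

  def done: Boolean = waiting == 0
}
```

In the Akka version the tracing work is farmed out to worker actors in parallel, and the countdown is what lets the coordinator know when the whole image is complete.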
Java work samples. The first is on design patterns, the second is Java 8 lambda exercises, the third is a Java project completed as individual coursework at the college. The last is a sizeable Battleship game project finished within a group of four fellow students.
Every work sample was produced during my studies at the college. All of them are parts of assessments for the modules "Object Oriented Design and Programming" and "Software Design and Programming".
Abstract Factory, Adapter, Bridge, Builder, Composite, Decorator, Factory, Factory Method, Prototype, Proxy and Singleton. The code is my own take on these design patterns, simplified as far as possible just to explore the behaviour of each pattern. It is all my own work, or in some cases adapted from existing samples.
Lambdas (anonymous functions) are one of the major Java features introduced in version 8. Here are eight samples where lambdas are used for a variety of purposes, including manipulating generic list members, mending strings, playing with predicates, returning a directory structure, sorting lists and others.
This project is part of the assessment for "Object Oriented Design and Patterns". The task was to write an interpreter for a simple machine language, SML. The part of the code that reads in a program and translates it into an internal form was already provided. The task was to complete the methods in the Instruction class. This required adding some fields and creating a subclass of Instruction for each kind of SML instruction (del, sum, mul, etc.). The switch statement that decides which type of instruction is created was to be replaced with Java reflection, so classes are chosen at runtime. Getters, setters and constructors had to be generated with Lombok.
This work example was developed in a group with three other programmers. It is a Battleship game where each player has a fleet and an ocean (hidden from the other player) and tries to sink the other player's fleet. The fleet consists of one aircraft carrier, two battleships, two submarines, two destroyers and four patrol boats.
The task was to develop a solo version, where the computer places the ships and the human attempts to sink them. The program consists of the Ship, Ocean and BattleshipGame classes, plus a class for each type of ship. The tests were written with JUnit, and object mocking was introduced. My task was to develop the Ship classes, along with the ShipFactory and tests for them. To maintain low coupling in line with the SOLID principles, the factory uses a list containing the ship-type classes. Initially, every ship type had to be registered.
This is a project to create an online system that lets self-employed couriers do their job independently: advertise their position and availability, receive feedback and trace the delivery process. It is my MSc dissertation for Birkbeck College and consists of two parts - a web application built with Spring MVC and a MySQL database, and a mobile application built for Android.
Web based job flow organisation for self-employed couriers
The task was to create a web application for self-employed couriers and customers with an online registration system, where couriers can keep records of previous jobs and customers can book jobs in the required area and trace the progress of delivery. In addition, customers can leave feedback on the service, which addresses the issue of courier trust. Furthermore, customers can communicate with the courier directly without using a third party (courier companies).
This system uses MySQL as a database, Hibernate as the ORM data model, REST services to pass JSON data to and from the mobile device, web flows and tiles; it is hosted on Amazon Web Services.
(on the left side is a yellow "help" box to explain the options)
Amazon Web Services (AWS).
As an example, this website currently runs on an AWS EC2 Ubuntu 14.04 virtual server instance, along with 90% of all the work samples explored here.
The AWS instance has the following software installed:
Besides the EC2 instance, the following AWS services are used:
Access to the EC2 Ubuntu terminal is usually established with a "PuTTY" SSH remote connection.
Two MapReduce programming samples developed with the Java Hadoop framework and completed on Amazon Web Services (AWS) using Elastic MapReduce (EMR).
The task was to calculate the probability of each word sequence in the given text: write a MapReduce program to find the conditional probability that a word w′ occurs immediately after another word w, i.e., Pr[w′|w] = count(w, w′)/count(w), for each two-word sequence (w, w′) in the entire corpus of Jane Austen's works from the Gutenberg project (two-word sequences across paragraph boundaries were ignored).
The input source:
The program finds the 10 words most likely to be said immediately after the word "for", i.e., those with the highest conditional probability Pr[w′|w = for]. Both the "pairs" pattern and the "stripes" pattern have been used.
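The statistic itself, outside Hadoop, can be sketched in a few lines of plain Scala: count consecutive word pairs and divide by the count of the first word. This is only the core computation; the Hadoop version distributes exactly this over mappers and reducers.

```scala
object CondProb {
  // All consecutive word pairs (w, w') in a token sequence.
  def pairs(words: Seq[String]): Seq[(String, String)] =
    words.sliding(2).collect { case Seq(a, b) => (a, b) }.toSeq

  // Top-k words w' by Pr[w'|w] = count(w, w') / count(w) for a fixed w.
  def topAfter(words: Seq[String], w: String, k: Int): List[(String, Double)] = {
    val ps = pairs(words)
    val countW = ps.count(_._1 == w).toDouble
    ps.filter(_._1 == w)
      .groupBy(_._2)
      .map { case (w2, occ) => (w2, occ.size / countW) }
      .toList
      .sortBy(-_._2)
      .take(k)
  }
}
```

The "pairs" MapReduce pattern emits one key per (w, w′) pair, while "stripes" emits, per w, a whole map of successor counts; both reduce to the same ratio computed here.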
Calculate PageRank from a given list of number pairs (edges): write a MapReduce program to find the PageRank score (with damping factor 0.85) for each user in the Epinions who-trust-whom online social network (from the SNAP dataset collection).
The goal was to find out which 10 users have the highest PageRank scores in this social network.
In summary, for both solutions I achieved 100 points, the highest mark possible.
Two Android applications and one Cordova project.
This is an Android application for the "Mobile and Ubiquitous Systems" module coursework at Birkbeck College. The task was to write an Android app which would initialise the phone ID, obtain the position via longitude and latitude, and start measuring the sound volume level. On a sound-level change, all data had to be fed to the cosm.com (now http://xively.com/) site. For the sound-level measurement I used the open-source classes Complex, FFT and Recorder. The MainActivity and FeedingActivity classes were written by me and are responsible for location detection, the interface and the feeding.
This application is the second part of my MSc dissertation "Web based Job Flow Organization for Self Employed Couriers". It is an Android program for couriers to receive jobs and send their location coordinates.
The Android application consists of three main activities and four classes. The first activity handles courier identification; the second holds the list of active jobs. When a job is completed, it disappears from the list. Jobs are shown in different colours representing their statuses: red for a new job, green for an approved job, yellow after collection is done. Every list item (job) is clickable and opens the third activity, where all the job details are presented. At the bottom of the third activity is a button which changes the job status: it can be either "approved" or "rejected", then changes to "collected" and ends with "delivered".
All functions that respond to the Android application's HTTP requests are located in the web application's RestController class (described in the "Spring" section). For simplicity, every request is treated as a GET request, and values to update or add data in the database are passed through HTTP as GET parameters. This controller contains five functions for different purposes: loginJason() and stopcourierjson() to log in and out of the application; updateStatusJson(), which responds to the button press from ShowJobActivity to change the job status - it uses the "id" parameter to locate the job and the "status" parameter for the value to set; getDetailsJson(), which gathers information for ShowJobActivity, returning JSON data with the job details matched by the job id passed as a parameter; and getjobsJson(), which returns a JSON array with every job related to a courier's username.
The PHP section consists of two Symfony1 project samples, two Symfony2 samples and three plain PHP projects: a web crawler, web page ranking and Apache web server log-file analysis.
With Symfony1 I developed a Content Management System (CMS) which was adapted and implemented in a variety of websites. The idea of the CMS is simple: every article on the page is related to one menu (category). Every category can have one or many subcategories, each of which also has one article. Additional articles are considered "outside slots" and can be linked to the page. User access was implemented with the "fos_user" plugin; administrator rights include the option to add/delete categories, etc. MySQL was used as the database. Relations between tables were established via an ORM - at the beginning "Propel", later changed to "Doctrine" models. Depending on the specific project the CMS was added to, galleries, Google Maps, web services, etc. were adapted.
(The comments are in the right side column)
One still-working project, built with the Symfony1 CMS, belongs to an ornithologist from Latvia. The CMS is used to maintain the site content; additionally, it has a sophisticated table structure for bird photo galleries, different for every species, among other parameters. There are also separate galleries for animals and a photo gallery for expeditions. Every species has an archive of bird voice recordings. Every species also has a chronicle archive, presented with Google Maps, of the GPS coordinates where a bird was ringed and later spotted.
This is a live project, and the code is not public.
Following the appearance of PHP5, the Symfony team released the Symfony2 framework, which has a totally different approach and code structure from Symfony1. This example contains my CMS rewritten in Symfony2. It has additional features, such as the possibility to split a website into different languages. Each language can hold a different set of menus. The article still holds the "id" from the menu, plus a language identification, so articles can be filtered according to locale. The CMS can additionally load parts of itself asynchronously. URLs can be changed manually to improve SEO, and there is an option to automatically generate SEO-friendly URLs from the text. There are also small improvements, like a draggable administrator menu, a better HTML editor, etc.
(The comments are in the right side column)
A working project built on top of the CMS for Symfony2: a small website for a minibus hire company. The site has a manageable front-page slide show. It holds different sub-pages, such as "minibus hire in Harlow", "minibus hire in Sawbridgeworth", etc., to increase online visibility. It has a manageable quoting system and a Google JS chart to represent percentages of various statistics gathered from information included with the quotes. This site no longer exists; however, I have kept a replica.
A web crawler written as a student for the module "Search engines and web navigation". The program collects an array of links from web pages to a depth of 5, beginning from the root; each entry in the array must be unique. Links must be under the root 'http://www.dcs.bbk.ac.uk/~martin/sewn/ls3/'. If they are disallowed in a possibly existing robots.txt file, they get a suffix marked with a star (*). The root (the starting point of crawling) is embedded in the code. When collection is done, the script loops through every array element (link) and, if it is allowed by robots.txt, downloads all the links contained in that web page. Output is shown on screen and also saved in the results.txt and crawl.txt files; both have download links at the end of the screen.
(It may take a while)
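The crawler itself is PHP, but its traversal rules can be modelled in a short Scala sketch over an in-memory link graph: a depth limit of 5, unique entries only, and disallowed links kept in the result but marked with a star. The graph, the function names and the `" *"` marker format are assumptions for this sketch.

```scala
object CrawlSketch {
  // links: page -> outgoing links; disallowed: robots.txt-style block list.
  def crawl(links: Map[String, List[String]],
            root: String,
            disallowed: Set[String],
            maxDepth: Int = 5): List[String] = {
    def loop(frontier: List[String], depth: Int,
             seen: Set[String], acc: List[String]): List[String] =
      if (depth > maxDepth || frontier.isEmpty) acc
      else {
        val fresh = frontier.filterNot(seen).distinct          // unique entries only
        val marked = fresh.map(u => if (disallowed(u)) u + " *" else u)
        // Disallowed pages are recorded but never expanded.
        val next = fresh.filterNot(disallowed).flatMap(u => links.getOrElse(u, Nil))
        loop(next, depth + 1, seen ++ fresh, acc ++ marked)
      }
    loop(List(root), 1, Set.empty, Nil)
  }
}
```

A real crawler replaces the map lookup with an HTTP fetch plus link extraction, which is also why the live demo "may take a while".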
This example performs basic link analysis and calculates PageRank. The first stage represents a section of the web as an adjacency matrix and uses it to find various linkage statistics; the second uses the same adjacency matrix to calculate PageRank scores for the pages. The program first initialises a two-dimensional array [500x500] filled with 0 values. It then loops through the text file, collects all the "Visited" links and saves them in a one-dimensional "visited" array; the order of this array is later used as index values for link association. The program converts relative links to absolute ones and removes duplicates (for example, one page holding two links to the same page). Because the matrix is huge, to eliminate mistakes the program collects incoming and outgoing links with two different algorithms: since all incoming and outgoing links for every page are known, the sums of links from both calculations must agree. To collect incoming links, for every "visited" link it goes through the given text file, correcting relative links to absolute ones using the base of the previous "visited" link; if it finds, among the "out" links, the same link as the current "visited" link, it counts it as an "in" link to the current visited page, again checking for duplicates. To collect outgoing links, it loops with every "visited" link through all the text pages and, on a match, reads all the following links until the next "visited" link, removing unvisited pages, converting relative links to absolute ones and eliminating duplicates; this algorithm is quicker. After the matrix is loaded with a binary representation of the graph, a "transpose" function can be applied to swap columns and rows. The program then loops through the matrix, reporting "in" and "out" links on screen and in the file, and calculates the mean, variance and standard deviation.
Finally, it calls the "rank" function, passing the T value as an argument, and calculates the PageRank scores from the given formula.
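The iteration behind that "rank" function can be sketched in plain Scala (the original is PHP). With damping factor d = 0.85, each pass computes PR(p) = (1 - d)/N + d · Σ PR(q)/out(q) over the pages q linking to p. The adjacency-list representation and the tiny graph in the test are my own choices for the sketch.

```scala
object PageRankSketch {
  // out: page -> pages it links to (the adjacency matrix, row by row).
  def pageRank(out: Map[Int, List[Int]], iters: Int = 50, d: Double = 0.85): Map[Int, Double] = {
    val nodes = (out.keySet ++ out.values.flatten).toList
    val n = nodes.size
    var pr = nodes.map(_ -> 1.0 / n).toMap                  // uniform start
    for (_ <- 1 to iters) {
      pr = nodes.map { p =>
        // Sum PR(q)/out(q) over every page q that links to p.
        val incoming = out.collect {
          case (q, links) if links.contains(p) => pr(q) / links.size
        }.sum
        p -> ((1 - d) / n + d * incoming)
      }.toMap
    }
    pr
  }
}
```

Scanning a column of the adjacency matrix (or a row of its transpose) is exactly the `collect` over incoming links here, which is why the program's "transpose" step makes the ranking pass cheap.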
This demonstrates how the interaction of a user with the web through a browser is recorded in server log files. All the data used is standardised, extended web-log output from (allegedly) the http://www.i-resign.com website. The logged information is taken from a pre-saved and cleaned text file, "18jan-logdata.txt". The data is then transformed into a single 12-column table and saved in a MySQL database. All the queries are written in SQL and executed with PHP.
a) The top ten IP addresses (or users) who requested the most URLs. http://php.raitis.co.uk/lab5/a.php
b) The top ten files/pages requested. http://php.raitis.co.uk/lab5/b.php
c) The top three most active hours (most requests per hour). http://php.raitis.co.uk/lab5/t.php
d) The top ten referrers. http://php.raitis.co.uk/lab5/d.php
e) Request status responses. http://php.raitis.co.uk/lab5/e.php
f) Data of user’s agent (browser, operating system etc.). http://php.raitis.co.uk/lab5/f.php
The following project is a portfolio website for a graphic designer. The main task was to use asynchronous page loading without losing any SEO capabilities. All visual effects were to be made only with the jQuery JS library.
The main structure of the site works by loading its content with the loadContent() function. Different parts of the page are located in separate PHP files (_logo.php, _popart.php, etc.). To obtain good SEO results, each link has an escaped fragment marked with "#!", so when the page is crawled by search-engine robots the reflected page is used in place of the asynchronous load. This website achieved recognisable success in search optimisation. The designer's name is Ilva, and the main search keywords were "ilva design". Unfortunately, ILVA is also the name of a chain of supermarkets selling designer furniture in Denmark. Despite that, the search results were on the first page of Google, sometimes ranking better than the supermarket (at the time Google didn't have separate search results per region).
The page also has jQuery-based effects: an animator, adding/removing CSS styles, a "galleriffic" photo gallery, opacity rollovers, a jQuery-validated contact form, etc.
The whole idea was to write an XSL stylesheet which would style XML documents with a structure like the nobel.xml or booker.xml files (download links are at the example). This stylesheet should be able to filter by a given XPath predicate, sort, and change the sequence of elements. The next step was to write JS code which would treat the stylesheet as an XML file and change a variety of DOM objects, producing a web page with an input field as a changeable source.