
Share Your Terminal Output Using this Secure Application


The authors describe in detail how to build an application that shares terminal output over the Internet and can also search the files that are shared.

Most educational institutes in India have mandatory projects for students as part of their courses. Working together on a project generally involves sharing terminal outputs, which can be cumbersome for collaborating teams. The Web application described in this article shares the terminal output on the Internet and can also search the files that are shared. It can be used by students and teachers, as well as by engineers working with IoT devices. It has been designed to share files of any format.

The motivation behind the project described in this article was to bring all these tasks together so that it could support many console output systems, ranging from education to IoT. Its GUI features ensure the system can be used without any technical knowledge.

The name of this application is ‘Real_time_sharing’. It is Web application software that helps share terminal output. After registering in the application with an email ID, a student can log in and share output files to the database, which uses MongoDB. By logging in, the faculty can access all the files shared by every student, evaluate the work based on its novelty, and then upload the evaluated files.

The developed application is platform-independent. It supports durability, monitoring and a secure, SSH-enabled connection to the database, i.e., MongoDB. A login system is provided for authenticating the appropriate users.

The workflow of the application is depicted in Figure 1. The source code for this project is available at https://github.com/cmouli96/real_time_sharing. The setup is as follows.

System configuration

Ubuntu 18.04 LTS, 8GB RAM and a 4-core CPU

Features

■ Users can upload output files from the terminal, in any format and of any size.

■ Guidelines for creating the output files are provided.

■ Users can download and delete files from the database.

■ Users can search for files in the database.

■ The async/await mechanism is used to handle promise chains.

■ The Express framework is used to structure the application and handle requests (a minimal setup is sketched after this list).

■ body-parser is used to parse incoming request bodies so that form data is available in the route handlers.

■ The EJS template engine is used to simplify rendering dynamic content in HTML pages.

■ Files are stored as chunks; this allows files of any size to be stored in the database.

■ Mongoose is used to create the schema and store the files in the database.
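
The following is a minimal sketch of how such an Express/body-parser/EJS/Mongoose setup typically looks. The file name, connection URI and port are assumptions for illustration and are not taken from the project's source.

// app.js -- minimal setup sketch; the URI and port below are placeholders
const express = require('express');
const bodyParser = require('body-parser');
const mongoose = require('mongoose');

const app = express();

// Parse URL-encoded form data and JSON request bodies
app.use(bodyParser.urlencoded({ extended: false }));
app.use(bodyParser.json());

// Render dynamic HTML pages with the EJS template engine
app.set('view engine', 'ejs');

// Connect to the MongoDB instance that will hold the shared files
mongoose.connect('mongodb://localhost:27017/real_time_sharing');

app.listen(3000, () => console.log('Server running on port 3000'));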

Functionality

Login/register: The application helps you to register by providing an email ID, user name, phone number and password. You can register as a student or an admin. The admin can see all the users’ files, and upload or download them. The details you provide are stored in the database after they are validated. The user registration template is shown in Figure 2.
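
As an illustration, the user model behind this registration flow might be defined with a Mongoose schema along the following lines; the field names and role values are assumptions, not taken from the project's source.

const mongoose = require('mongoose');

// Hypothetical user schema for the registration flow
const userSchema = new mongoose.Schema({
  username: { type: String, required: true },
  email:    { type: String, required: true, unique: true },
  phone:    { type: String, required: true },
  password: { type: String, required: true },  // store a hash, never plain text
  role:     { type: String, enum: ['student', 'admin'], default: 'student' }
});

module.exports = mongoose.model('User', userSchema);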

You can log in using the email ID and password provided while registering. The application validates the details and redirects you to the user page; otherwise, it shows an error message that says ‘email-id/password incorrect’. The user login page is shown in Figure 3.
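
A login route implementing this check could be sketched as follows, building on the app object and user model from the earlier sketches; the use of bcrypt for password comparison, the route paths and the redirect targets are assumptions for illustration.

const bcrypt = require('bcrypt');          // assumed hashing library
const User = require('./models/user');     // hypothetical path to the user model

app.post('/login', async (req, res) => {
  const { email, password } = req.body;
  const user = await User.findOne({ email });

  // Reject unknown users or wrong passwords with the same error message
  if (!user || !(await bcrypt.compare(password, user.password))) {
    return res.render('login', { error: 'email-id/password incorrect' });
  }

  // Send admins and students to their respective pages
  res.redirect(user.role === 'admin' ? '/admin' : '/user');
});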

User page: You can upload files to the application, and all your files are displayed on this page. You can also download or delete the files at any time. The files you upload can be of any size, as they are all stored in the form of small chunks. Figure 4 shows the user page, Figure 5 shows the admin page, and Figure 6 demonstrates how to create a console file.

GridFS: GridFS storage is used to store all your files. It divides each file into small chunks of bytes, which helps store the data efficiently. GridFS gives every chunk an ID that ties it to its file, so that the original order is followed when the files are retrieved.

Connection to MongoDB: GridFS uses the given location to store the files in the form of chunks. Mongoose helps with the storage in MongoDB. First, connect to MongoDB. Then create a schema and use it to fix a location address as the storage. That address is passed to GridFS with the help of Multer. GridFS and Mongoose together store the user files in the database.
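
A common way to wire Multer to GridFS is the multer-gridfs-storage package; the sketch below assumes that package and a bucket named 'uploads', neither of which is confirmed by the article.

const multer = require('multer');
const { GridFsStorage } = require('multer-gridfs-storage');  // assumed package

// Point the Multer storage engine at the same MongoDB instance
const storage = new GridFsStorage({
  url: 'mongodb://localhost:27017/real_time_sharing',  // placeholder URI
  file: (req, file) => ({
    filename: file.originalname,
    bucketName: 'uploads'   // chunks land in uploads.files and uploads.chunks
  })
});

const upload = multer({ storage });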

Retrieving files from storage: Students can view all the files in their own account, while the admin can view all the files in the database. Files are sorted by the time at which they were uploaded.
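
The listing can be built on the MongoDB driver's GridFSBucket API, roughly as below, reusing the app and mongoose objects from the earlier sketches; the bucket name and the route path are assumptions.

const conn = mongoose.connection;
let bucket;

// Create the bucket once the connection to MongoDB is open
conn.once('open', () => {
  bucket = new mongoose.mongo.GridFSBucket(conn.db, { bucketName: 'uploads' });
});

app.get('/user', async (req, res) => {
  // Newest uploads first; the real application would additionally filter
  // a student's files by owner (e.g., through GridFS metadata)
  const files = await bucket.find({}).sort({ uploadDate: -1 }).toArray();
  res.render('user', { files });
});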

Uploading files to a database: The upload function is already defined; here, it uploads your files into the user’s database.
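
With the Multer/GridFS storage engine shown earlier, the upload route reduces to a single middleware call; the route path and the form field name 'file' are assumptions.

app.post('/upload', upload.single('file'), (req, res) => {
  // By this point Multer has already streamed the file into GridFS as chunks
  res.redirect('/user');
});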

Downloading files from a database: You can download any file that is retrieved and displayed on the user page by clicking the ‘Download’ button.
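
A download route can stream the stored chunks back in their original order; openDownloadStreamByName is part of the GridFSBucket API, while the route path is an assumption.

app.get('/download/:filename', (req, res) => {
  // Stream the chunks back to the browser as a file attachment
  res.set('Content-Disposition',
          `attachment; filename="${req.params.filename}"`);
  bucket.openDownloadStreamByName(req.params.filename).pipe(res);
});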

Deleting files from a database: You can delete any file that is retrieved and displayed on the user page by using the ‘Delete’ button.
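
Deletion removes both the file document and its chunks through the bucket's delete method; passing the file's ObjectId in the URL is an assumption.

app.post('/delete/:id', async (req, res) => {
  // Removes the entry from uploads.files along with all of its chunks
  await bucket.delete(new mongoose.Types.ObjectId(req.params.id));
  res.redirect('/user');
});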

ELK logger

GELF: The Graylog Extended Log Format provides better functionality than the traditional syslog format. It is space-optimised and reduces the payload. It has well-defined data structures in which strings and numbers can be clearly differentiated, something that is absent in traditional syslog. It also provides compression. We have routed the incoming traffic from GELF to Logstash and forwarded it to Elasticsearch at port 9200.
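
Because a GELF message is just a JSON document, the application can emit one with Node's built-in dgram module, as sketched below. The Logstash GELF input port (12201, the GELF default) and the host field are assumptions; only the Elasticsearch port 9200 comes from the article.

const dgram = require('dgram');

// Send one GELF 1.1 message to the Logstash GELF input (assumed port 12201)
function sendGelf(shortMessage, level = 6) {
  const payload = Buffer.from(JSON.stringify({
    version: '1.1',
    host: 'real_time_sharing',    // assumed application host name
    short_message: shortMessage,
    timestamp: Date.now() / 1000,
    level                         // syslog-style severity, 6 = informational
  }));

  const socket = dgram.createSocket('udp4');
  socket.send(payload, 12201, 'localhost', () => socket.close());
}

sendGelf('File uploaded successfully');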

Elasticsearch, Logstash and Kibana (ELK): We have configured the ELK stack in Docker containers with the appropriate networks and hosts, so that it can run anywhere, provided there is an appropriate path from GELF to Logstash.

We have written a docker-compose file for the three containers, which are started by running the docker-compose up command. Figure 7 shows the command execution screenshot.

The three containers sync up, and the environments get configured according to the docker-compose.yml file. Then Logstash collects the logs, which is shown in Figure 8.
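
A minimal docker-compose.yml for such a three-container stack might look like the sketch below; the image versions, the GELF port mapping and Kibana's port are assumptions, as the article only mentions Elasticsearch on port 9200.

version: '3'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.9.1
    environment:
      - discovery.type=single-node
    ports:
      - "9200:9200"           # Elasticsearch, as mentioned in the article

  logstash:
    image: docker.elastic.co/logstash/logstash:7.9.1
    ports:
      - "12201:12201/udp"     # assumed GELF input port
    depends_on:
      - elasticsearch

  kibana:
    image: docker.elastic.co/kibana/kibana:7.9.1
    ports:
      - "5601:5601"           # Kibana web UI (default port)
    depends_on:
      - elasticsearch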

Elasticsearch and Kibana parse the incoming log messages and colour-code them, with green marking the most appropriate and correct messages, as shown in Figure 9. Red marks the most inappropriate and severe ones.

Kibana and the rest of the ELK stack now run in parallel, fetching the logs automatically at a set interval.

You can check that all the containers are in sync with the docker ps -a command, which shows the running time of all three containers (Figure 10).

You can terminate the docker-compose command with Ctrl+C, which stops all the containers sequentially while maintaining their state. The screenshot for this is shown in Figure 11.

This application can be used as a platform for online exams and for IoT based receivers. It can be used to track real-time console outputs for the evaluation of any project, including audio and video based projects apart from the routine text and image based ones. IoT sensors need to transfer large chunks of data frequently in order to work properly; you can add a cron job and use the functionality of this project for that. Video proctoring features can also be added to this application.

Figure 1: Workflow of the application
https://www.freepik.com
Figure 2: User registration
Figure 3: User login
Figure 4: User page
Figure 5: Admin page
Figure 6: Creating a console file
Figure 7: Executing the docker-compose command
Figure 8: Logstash collecting the logs and forwarding to Elasticsearch
Figure 9: Kibana output with colour coding and status
Figure 10: ELK containers
Figure 11: Killing the containers
