
Friday, December 06, 2019

How to prevent burnout in IT


Let's first think about what might cause it:
Burnout is an increasingly frequent condition in the IT sphere, and it is usually caused by working in a toxic culture, which can include:
- disliking colleagues' two-faced behavior, or what they do with their money :)
- colleagues who may be trying to push you off their career path
- non-supportive, half-joking / passive-aggressive behavior from supervisors, without being able to respond
- not enjoying the technologies you work with, such as outdated software, frameworks, or languages
- trying to keep up the pace by constantly learning, while fearing that if you don't know the latest updates in detail you won't fit in the team and will be fired
- trying to schedule your rest/work time, only to be flooded at unexpected moments with time-restricted task tickets
- not having the opportunity to express your views on certain subjects, or having them ignored when you do
- thinking that the whole IT sphere is filling up with people who would just as happily pick potatoes if the job paid well
All of this can quickly build up, and you may start having repetitive negative thoughts about doing the work while trying to stay cheerful around colleagues - a form of cognitive dissonance.

The following are my remedies:
You'll need to learn how to quickly switch between your brain's focused and default networks:
Take at least one hour of walks in fresh air.
Here is some immediate, basic exercise you can do without postponing :)

Do sports: running combined with positively reinforced repetitions of "I am OK, I am healthy..." - fill in whatever empowering words work for you.
Work at a standing desk to reinforce your focused network, and take breaks as often as you like. Remember, your job and that of an HR person are not of the same difficulty :) If you're on a budget, you can build a standing desk yourself from some books or cardboard boxes.
Optional: make extensive use of high-quality headphones.
About performing tasks: write the task down as a mini, detailed plan of how to do it, then just relax and have fun. When it's time to work on a particular task, simply open the plan and follow it, without having to think or improvise about the proper way to do the task or which task to choose.
Lastly - simply quit your job.

Event delegation in JavaScript

In JavaScript we use event delegation for two main reasons:
1) to be able to process events coming from dynamically added page elements.
2) to achieve better application performance by listening to only one main element instead of having to add and remove event listeners each time a new element is dynamically added to the DOM.
You can discover more intriguing JavaScript aspects inside the JavaScript for beginners - learn by doing course.

 


With event delegation, when an event occurs, we can filter which of the event-generating source elements we want to respond to, using the event's target property: event.target
Here are two examples of how to use the event target:
// to filter out the processing of events coming from elements having different classes:
if (!event.target.classList.contains('my_class')) return;
// or to filter based on a particular tag:
if (event.target.tagName == 'INPUT') {
// do our processing only if the source event came from an input tag
}

Example: here is the HTML for a simple ul/li list plus an "Add new LI" button:
<button id="add">Add new LI</button>
<ul class="characters">
    <li>
        <input type="checkbox" data-index="0" id="child0">
        <label for="child0">Child 0</label>
    </li>
    <li>
        <input type="checkbox" data-index="1" id="child1">
        <label for="child1">Child 1</label>
    </li>
    <li>
        <input type="checkbox" data-index="2" id="child2">
        <label for="child2">Child 2 </label>
    </li>
</ul>
<script src="event_delegate.js"></script>

And here is our JavaScript code, which adds new LI elements dynamically:
document.querySelector('#add').addEventListener('click', () => {
    // we create a new li element
    let li = document.createElement('li');
    // we get the last data-index attribute within ul > li
    let dataId = characterList
        .lastElementChild // we use ElementChild instead of Child, because it ignores text and comment nodes
        .firstElementChild
        .getAttribute('data-index');
    // increase the last id by 1
    ++dataId;
    // construct the new LI element with the increased dataId
    li.innerHTML = `
    <input
    type="checkbox"
    data-index="${dataId}"
    id="child${dataId}">
    <label for="child${dataId}">Child ${dataId}</label>
    `;
    // append the new LI element to the UL
    characterList.append(li);
});

// this is our little debug function showing the event and where it came from
function toggle(event) {
    console.log('event: ' + event); // the event
    console.log('target: ' + event.target); // where the event occurred
    console.log('currentTarget: ' + event.currentTarget); // the parent element the listener is attached to
    console.log('toggled element with id: ' + event.target.id);
}

// the actual parent element, where the event delegation is happening
const characterList = document.querySelector('.characters');
// here, instead of attaching an event listener to each LI, we attach just one event listener to the UL list
characterList.addEventListener('click', toggle);

// Bonus: here is how to query a specific element using its data-index attribute:
console.log(document.querySelector('input[data-index="1"]'));
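Tying the two filtering examples back to this page: the check can live in a small pure function that toggle() calls first. This is only a sketch; isToggleTarget is an illustrative name, not part of the code above:

```javascript
// Decide whether a delegated click should be handled,
// based on the event target's tag name and data-index attribute.
function isToggleTarget(target) {
    // only react to checkboxes that carry a data-index attribute
    return target.tagName === 'INPUT' &&
        Boolean(target.dataset) &&
        target.dataset.index !== undefined;
}
```

Inside toggle() you could then guard early with: if (!isToggleTarget(event.target)) return;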

Congratulations!

Resources:

JavaScript for beginners - learn by doing

Tuesday, November 26, 2019

One way data binding in JavaScript

Two-way data binding is often used in Angular, React and Vue, but not everyone likes it. I think simple one-way data binding is actually beneficial when writing plain JavaScript code. You can discover more intriguing JavaScript aspects inside the JavaScript for beginners - learn by doing course.


What is one-way data binding? Simply said: the moment we change our data variable (the model), we want that change to be reflected immediately inside our generated HTML. Let's take a look at this example:
We have two span elements with two data bindings, quote1 and quote2. You may think of them as just ids:
<span data-binding="quote1"></span>
<br>
<span data-binding="quote2"> </span>

Then we create the HTML-changing function (render). It grabs the element matching the passed property parameter and changes its innerHTML to whatever is stored under that same property in a special state object - effectively updating the HTML based on the passed parameter.

const render = property => {
    document.querySelector(`[data-binding="${property}"]`).innerHTML = state[property];
};

Now to the main function, setState. Its goal is to create a Proxy around a passed state object, so that whenever a property of the state changes (i.e. the user puts information into it via set), the proxy performs certain actions: setting the updated value on the property, and using the previously defined render function to update the HTML view:

const setState = state => {
    return new Proxy(state, {
        set(target, property, value) { // runs whenever a property of the wrapped object changes
            target[property] = value; // update the value of the property
            render(property); // render the HTML with the updated property to the screen
            return true; // a set trap must report success, otherwise strict mode throws a TypeError
        }
    });
};

Now let's set an initial state:
const state = setState({
quote1: 'Initial quote state.'
});

Once it is set we can display the initial state on the console: console.log(state.quote1);

We can now try to modify some properties inside the Proxy state:
state.quote1 = 'quote1 new state';
state.quote2 = 'quote2 new state';
and see how their updated changes are reflected on the browser HTML.
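To convince yourself the Proxy works independently of the DOM, you can swap the render function for one that just records which properties were re-rendered. This is a sketch; rendered and createState are illustrative names:

```javascript
// record which properties were "rendered" instead of touching the DOM
const rendered = [];
const render = property => rendered.push(property);

// same setState logic as above, under a different name
const createState = obj => new Proxy(obj, {
    set(target, property, value) {
        target[property] = value; // update the underlying object
        render(property);         // notify the "view"
        return true;              // a Proxy set trap must return true on success
    }
});

const state = createState({ quote1: 'initial' }); // constructing does not trigger set
state.quote1 = 'updated';      // triggers render('quote1')
state.quote2 = 'new property'; // triggers render('quote2')
```

Note that properties passed into the constructor do not trigger the set trap; only later assignments do.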

You may now use this Proxy-based pattern in your projects.

Congratulations!

Resources:

JavaScript for beginners - learn by doing

Monday, November 25, 2019

Create REST API with NodeJS

An introductory part of the Learn Node.js, Express and MongoDB + JWT course.


First, we will install the latest npm and nodejs version from nodesource: https://github.com/nodesource/distributions

curl -sL https://deb.nodesource.com/setup_13.x | sudo -E bash -
sudo apt-get install -y nodejs

Then we will create a directory api, where we will set up our project:
mkdir api
cd api
npm init
(provide a project name, description, and author name when prompted)
Then it is time to install some required packages:
We will install express in order to act as a web server together with nodejs: npm i express
Open the package.json file and notice how now express appears under the dependencies section.

We will also install the MongoDB driver for our database, plus mongoose as a DB connection helper: npm i mongodb mongoose
Then we will do a Babel installation. Babel (preset-env) will transpile (convert) our code so it becomes cross-browser compatible, which is especially useful when you want broader JavaScript support. Babel (node) lets Node.js understand the modern ES features we will be using inside the index.js file.
npm i --save-dev @babel/preset-env @babel/core @babel/node
Now if you open package.json you will see that Babel is installed only as a development dependency. Before we start using Babel we will configure it: just create a file named .babelrc with the following content:
{
"presets": ["@babel/preset-env"]
}
Here we are just preparing the Babel transpiling to be suited for specific browsers.
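If you want preset-env to target specific browsers, the preset also accepts a browserslist query via its targets option; for example (the "defaults" query here is only an illustration):

```json
{
    "presets": [
        ["@babel/preset-env", { "targets": "defaults" }]
    ]
}
```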

We will also install nodemon, so it can automatically monitor and restart the nodejs server (loading the new code) whenever we make changes to our code.
npm i nodemon --save-dev
/* Later we will need  body-parser to be able to parse when receiving as well as send application/json type of requests: npm i body-parser
*/

Now let's change the start script in package.json so it runs the upcoming index.js file. Replace the 'test' script with:
"start": "nodemon --exec babel-node index.js"
With this change we can now just type npm start, and it will run the transpiled version of index.js (using Babel) so we can then browse the application in our browser.

In order to be able to browse the output of the index.js file, we will build a server that will serve it. The server will be bound to the localhost IP address and will listen to a specific port.
Create index.js with the following content:
import express from 'express'; // we import the express library
// since we are using babel, the import syntax will work even where ES6 modules are not natively supported
const app = express(); // we instantiate an application object from the express library
// note the usage of const -> we will not reassign the app variable later, so const is appropriate here
const PORT = 4000; // let's define a listening port
// we also create our first request route for the root path / :
app.get('/', (req, res) => res.send('server running'));
// basically we say: for a request to the root url /, serve the message "server running"
app.listen(PORT, () => console.log(`listening on ${PORT}`));
// here we just listen on port 4000 and log to the console that the express server is running

Now we can just type: npm start and browse inside: http://localhost:4000 to see our application running!

Ok, let's use mongoose to connect to our MongoDB database:
import mongoose from 'mongoose';
then we will connect with the following options:
mongoose.connect(
'mongodb://localhost/my_db',
{
useNewUrlParser: true,
useUnifiedTopology: true
}
);

...
We will proceed with a standard MVC pattern: routes that receive browser requests and direct them to specific controllers, controllers that take care of the logic and use models to perform various actions on the MongoDB database. index.js will load everything from a new subdirectory /src/, where our routes, models, and controllers will reside.

Routes
Now it is time to create our routes properly. We will create the directory /src/routes/ with a file routes.js inside,
where we will define and export the routes for our application like so:
const routes = (app) => {
    app.route('/user')
        .get((req, res) => res.send('Getting information about all users'))
        .post((req, res) => res.send('Creating new user'));
    app.route('/user/:userID')
        .get((req, res) => res.send('Getting specific user by ID'))
        .put((req, res) => res.send('Updating user by ID'))
        .delete((req, res) => res.send('Deleting user by ID'));
};
export default routes;

Inside index.js we will import the created routes:
import routes from './src/routes/routes';
then we will load up those routes with:
routes(app);
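To see what routes(app) actually does without starting Express, you can pass it a minimal stub whose route() method returns a chainable recorder. This is purely illustrative (makeStubApp is not part of Express; the handlers are placeholders):

```javascript
// a routes module in the style above, with placeholder handlers
const routes = (app) => {
    app.route('/user')
        .get(() => 'all users')
        .post(() => 'create user');
};

// minimal stub that records which HTTP verbs were registered per path
function makeStubApp() {
    const registered = {};
    return {
        registered,
        route(path) {
            registered[path] = [];
            const chain = {
                get() { registered[path].push('get'); return chain; },
                post() { registered[path].push('post'); return chain; }
            };
            return chain;
        }
    };
}

const app = makeStubApp();
routes(app); // registers GET and POST handlers on /user
```

Express's real app.route() works the same way on the surface: it returns an object whose .get/.post/.put/.delete methods are chainable.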

Using MongoDB
We will create a schema describing how we would like our documents inside MongoDB to look, inside the /src/models/model.js file:
import mongoose from 'mongoose';
const Schema = mongoose.Schema;
export const UserSchema = new Schema({
    firstName: { type: String, required: 'Enter firstname' },
    lastName: { type: String, required: 'Enter lastname' },
    email: { type: String, required: 'Enter email' },
    created_at: {
        type: Date,
        default: Date.now
    }
});

Alright, we will use this UserSchema when gradually creating our controllers.
But before all that, let's make sure Express can recognize the requests we will be working with (inside index.js):
// we use the built-in middleware functions in Express to parse requests carrying JSON payloads
// to parse request bodies of type application/json:
app.use(express.json());
// and to parse request bodies of type application/x-www-form-urlencoded:
app.use(express.urlencoded({ extended: true }));

Then we create a directory with the file controllers/controllers.js:
import mongoose from 'mongoose';
import { UserSchema } from '../models/model.js'; // so we can use our newly created UserSchema
const User = mongoose.model('User', UserSchema); // the User constant holds a reference to the User model, created from the provided UserSchema

We will now create our first function, which works with MongoDB and instructs the model to create a new User:
export const addNewUser = (req, res) => {
    let newUser = new User(req.body); // we use the whole request body to create a new User model instance based on the UserSchema
    // once we have the model instance we can use its functions such as save, find, findOneAndUpdate and others, provided by mongoose
    newUser.save((err, user) => {
        if (err) { return res.send(err); } // we send an error if we couldn't create the db document
        res.json(user); // if everything went well, we output the created document as json
    });
};

Now we go back and modify routes.js to use the addNewUser function:
1) import it: import { addNewUser } from '../controllers/controllers';
2) change the .post method to: .post(addNewUser);

Next we copy the addNewUser function and modify it into a function that gets all of our Users:

export const getUsers = (req, res) => {
    User.find({}, (err, users) => { // note: we find all users via the {} empty search condition; we also don't need another instance of User, we just reuse the model created above
        if (err) { return res.send(err); }
        res.json(users);
    });
};
Now again in routes.js we import the getUsers function and change the .get method to:
.get(getUsers);
And by the way, you can use Postman to test whether the routes are working correctly!

Now for the /user/:userID routes:
// we will use MongoDB's .findById() to get a specific user by its ID
export const getUserByID = (req, res) => {
    User.findById(req.params.userID, (err, user) => { // note: we get the userID from the request parameters
        if (err) { return res.send(err); }
        res.json(user);
    });
};
Don't forget to update the .get method under the '/user/:userID' route to activate the getUserByID function: .get(getUserByID)

Let's update some users:
export const updateUser = (req, res) => {
    User.findOneAndUpdate(
        { _id: req.params.userID }, // we search by _id, which equals the passed-in :userID
        req.body, // we pass the request body containing the updated information
        { new: true, useFindAndModify: false }, // with new:true the newly updated user is returned to res.json
        (err, updatedUser) => {
            if (err) { return res.send(err); }
            res.json(updatedUser); // we send the updated user information as json
        });
};

Try to create the deleteUser function by yourself; it will use the userID parameter just the way the updateUser function did. Hint: you can use the .remove() MongoDB function!

Congratulations and enjoy learning !

Using MongoDB Atlas Database as a Service

Atlas is a Database-as-a-Service offering from MongoDB, hosted in the cloud. Let's see how we can use its functionality.

Note: if you want to explore more REST APIs you might find this course interesting: Learn Node.js, Express and MongoDB + JWT


Setup an Atlas account
First we will create a free account on MongoDB Atlas. We will then log in and create our first cluster via Build a New Cluster. Inside a cluster you can have many databases along with their replicas, which serve as backups in case of failure and can be located in multiple places, thus allowing easier scaling of the database.
Choose one of the available clusters marked with a free tier. Click on the Security tab and then on the Add New User button. There, create a user with a password, granting read and write permissions to any database. Then we need to whitelist our IP address in order to connect to the cluster. If you don't know your external IP address, for development purposes you can type 0.0.0.0 to allow connections to Atlas from everywhere, or click the Add Current IP Address button to whitelist your own IP address.
Next, we will click the Connect button, which reveals the different ways to connect to the Atlas cluster. Choose Connect Your Application, then choose the Short SRV connection string. You can now copy the revealed SRV address.

Connect to Atlas
Installing packages
1. Install npm and node from nodesource repository: https://github.com/nodesource/distributions
2. Create an empty project directory and inside it type npm init to initialize the project (type in your name, the description of the project, and any other details you prefer)
// 3. Install the CORS package - to be able to make requests without cross-origin restrictions: npm install cors. You can open package.json and see the cors package listed inside the dependencies section.
// 4. Install express - to be able to create and run a local API server
5. Install mongoose - for the connection to Atlas: npm install mongoose
// 6. install body-parser to be able to work with specific parts from the body of the HTTP request: npm install body-parser

Setup of authentication configuration keys
Modify the package.json file to use:
"scripts": {
 // "start:dev": "./node_modules/.bin/nodemon server.js",
 "start:prod": "NODE_ENV=production node server.js"
}
The start:prod line sets the production environment while starting our main server.js file; we can then start the app with npm run start:prod.

Then create a keys.js file with:
if (process.env.NODE_ENV === 'production') {
    module.exports = require(__dirname + '/keys_prod');
} else {
    module.exports = require(__dirname + '/keys_dev');
}
We basically load one of two files, depending on which environment we are using (production or development). Then create a keys_prod.js file and place inside it the connection string you got from the Atlas website above.
Replace the password placeholder in the string with your Atlas user password.
Grab the cluster connection information and place it inside a mongoURI key, nesting the other connection parameters inside a mongoCFG key, so keys_prod.js exports:
module.exports = {
    mongoURI: 'mongodb://user:password@cluster0-shard-00....',
    mongoCFG: {
        useNewUrlParser: true,
        ssl: true,
        replicaSet: 'Cluster0-shard-0',
        authSource: 'admin',
        retryWrites: true
    }
};
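The switch in keys.js reduces to choosing a file name based on NODE_ENV. As a small pure function (keysFileFor is an illustrative name, not part of the project):

```javascript
// pick which keys module to load, mirroring the keys.js environment switch
function keysFileFor(nodeEnv) {
    return nodeEnv === 'production' ? '/keys_prod' : '/keys_dev';
}
```

Any value other than 'production' (including an unset NODE_ENV) falls back to the development keys.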

The API server
It will be used to respond to requests and connect to Atlas. Create server.js file and inside fill in the following information:
// initially we require the already installed packages
//const cors  = require('cors');
//const express = require('express');
const mongoose  = require('mongoose');
//const bodyParser  = require('body-parser');

// get the connection parameters
const config = require(__dirname + '/config/keys');
// create a new express application server (uncomment the express require above first)
// var app = express();
//connect to MongoDB atlas using mongoose and our connection configuration
mongoose.connect(config.mongoURI,config.mongoCFG).catch((error)=>console.log(JSON.stringify(error)));

Run the code with:
npm run start:prod
// or directly: node server.js
You should see that you are connected to the remote Atlas DB server!


// To test the express API endpoints you can use Postman, you will also need to create a postman account.
// Download the application, choosing your OS platform. Then for Ubuntu just type:
// tar -zxvf postman.....tar.gz
// then enter inside the newly created directory Postman, then run ./Postman

Congratulations!

