
Monday, November 25, 2019

Using MongoDB Atlas Database as a Service

Atlas is an example of a Database as a Service (DBaaS): a cloud-hosted database offered by MongoDB. Let's see how we can use its functionality.

Note: if you want to explore more REST APIs you might find this course interesting: Learn Node.js, Express and MongoDB + JWT


Setup an Atlas account
First, we will create a free account at MongoDB Atlas. We will then log in and create our first cluster via Build a New Cluster. A cluster can hold many databases along with their replicas, which serve as backups in case of failure and can be located in multiple regions, allowing easier scaling of the database.
Choose one of the clusters marked with a free tier. Click on the Security tab and then on the Add New User button. From there create a user with a password, granting read and write permissions to any database. Then we need to whitelist our IP address in order to connect to the cluster. If you don't know your external IP address, for development purposes you can enter 0.0.0.0/0 to allow connections to Atlas from anywhere, or click the Add Current IP Address button to whitelist your own IP.
Next, we will click on the Connect button, which reveals the different ways to connect to the Atlas cluster. Choose Connect Your Application, then choose the Short SRV connection string. You can now copy the revealed SRV address.

Connect to Atlas
Installing packages
1. Install npm and node from the NodeSource repository: https://github.com/nodesource/distributions
2. Create an empty project directory and inside it type npm init to initialize the project (type in your name, a description of the project and any other details you prefer)
3. Install the CORS package, to be able to make requests without cross-origin restrictions: npm install cors. You can open package.json and see the cors package listed inside the dependencies section.
4. Install express, to be able to create and run a local API server: npm install express
5. Install mongoose, for the connection to Atlas: npm install mongoose
6. Install body-parser, to be able to work with specific parts of the HTTP request body: npm install body-parser

Setup of authentication configuration keys
modify the package.json file so the scripts section contains:
"scripts": {
 "start:dev": "nodemon server.js",
 "start:prod": "NODE_ENV=production node server.js"
}
(the start:dev entry requires nodemon: npm install --save-dev nodemon)
The second entry sets a production environment while starting our main server.js file; we can then launch the app with npm run start:prod.

Then create a keys.js file (inside a config/ directory, since the server will require it from there) with:
if (process.env.NODE_ENV === 'production') {
module.exports = require(__dirname + '/keys_prod');
} else {
module.exports = require(__dirname + '/keys_dev');
}
We basically load one of two files, depending on which environment we are using (production or development). Then create a keys_prod.js file and place inside it the connection string you got from the Atlas website above.
Replace the password placeholder with your Atlas user's password.
Grab the cluster information and place it inside the mongoURI key, nest the other parameters inside the mongoCFG key, and export both from the module:
module.exports = {
mongoURI: 'mongodb://user:password@cluster0-shard-00....',
mongoCFG: {
useNewUrlParser: true,
ssl: true,
replicaSet: 'Cluster0-shard-0',
authSource: 'admin',
retryWrites: true
}
};

The API server
It will be used to respond to requests and to connect to Atlas. Create a server.js file and fill in the following:
// initially we require the already installed packages
const cors = require('cors');
const express = require('express');
const mongoose = require('mongoose');
const bodyParser = require('body-parser');

// get the connection parameters
const config = require(__dirname + '/config/keys');
// create a new express application server
const app = express();
// connect to MongoDB Atlas using mongoose and our connection configuration
mongoose
.connect(config.mongoURI, config.mongoCFG)
.then(() => console.log('Connected to the remote Atlas DB server!'))
.catch(error => console.log(JSON.stringify(error)));

Run the code with:
npm run start:prod
(or directly: NODE_ENV=production node server.js)
You should see that you are connected to the remote Atlas DB server!


To test the Express API endpoints you can use Postman (you will also need to create a Postman account).
Download the application for your OS platform. Then for Ubuntu just type:
tar -zxvf postman.....tar.gz
then enter the newly created Postman directory and run ./Postman

Congratulations!

Sunday, November 24, 2019

Observer and subscriptions in JavaScript

Observers are widely used in JavaScript frameworks such as Angular and in libraries like RxJS. That is why it is good to know what they do under the hood; the same principle is successfully utilized by JavaScript event listeners. You can discover more intriguing JavaScript aspects inside the JavaScript for beginners - learn by doing course.

In general, we have a list of subscribers that subscribe to certain events. When an event emits data, all of its connected subscribers receive the data. The role of an observer is to observe and distribute events to the proper subscribers. The act of subscribing to an event is referred to as a subscription.


Now to the code:

// we create a class: Observer
class Observer {
// inside its constructor we set up an initially empty list of subscribers
constructor() {
this.subscribers = [];
}

// the subscribe function just adds a subscriber to the subscribers' list
subscribe(subscriber) {
this.subscribers.push(subscriber);
}


// optional: to unsubscribe from a particular event we just remove the subscriber from the subscribers' array. Note that we need to pass the whole subscriber object (containing event and action) so it can be compared against the entries of the this.subscribers array; this lets us distinguish between multiple subscribers for the same event.
unsubscribe(subscriber) {
this.subscribers = this.subscribers.filter(s => s !== subscriber);
}


// inside the publish function, we do several things
publish(event, data) {
 // we search for those subscribers which are subscribed and respond to the same event as the passed function parameter (event)
this.subscribers
.filter(
subscriber =>
subscriber.event === event
)
// then for each of the found subscribers we propagate the data parameter, i.e. send the information to them
.forEach(
subscriber =>
subscriber.action(data)
);
}

}

Now let's see the observer usage in action:

// we first construct object out of our observer class: my_observer
const my_observer = new Observer();

// then we subscribe to event 1: we set up the event we will be listening to, as well as the action to be performed when the event is received
my_observer.subscribe({
event: 'event 1',
action: (data) => {
console.log('received event 1', data);
}
});

// now to subscribe to another event
my_observer.subscribe({
event: 'event 2',
action: (data) => {
console.log('received event 2', data);
}
});

// when we have the subscriptions, we fire an event to event 1, using a delay via setTimeout()
setTimeout(() => {
my_observer.publish('event 1', 'Sending data to event 1 listeners/subscribers');
}, 1500);

// and let's fire an event to the event 2 subscriber
setTimeout(() => {
my_observer.publish('event 2', 'Sending data to event 2 listeners');
}, 2500);

We should be able to see both events received and responded to.
Please test the code to see the observers working in action.
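The unsubscribe() method never gets exercised in the example above; here is a self-contained sketch (re-declaring a minimal version of the same Observer class) showing a subscriber being removed and no longer receiving events:

```javascript
// minimal re-statement of the Observer from above, plus an unsubscribe demo
class Observer {
  constructor() {
    this.subscribers = [];
  }
  subscribe(subscriber) {
    this.subscribers.push(subscriber);
  }
  unsubscribe(subscriber) {
    // comparison is by reference, so pass the same object you subscribed with
    this.subscribers = this.subscribers.filter(s => s !== subscriber);
  }
  publish(event, data) {
    this.subscribers
      .filter(s => s.event === event)
      .forEach(s => s.action(data));
  }
}

const observer = new Observer();
const received = [];
const subscriber = {
  event: 'event 1',
  action: data => received.push(data)
};

observer.subscribe(subscriber);
observer.publish('event 1', 'first');  // delivered
observer.unsubscribe(subscriber);
observer.publish('event 1', 'second'); // no longer delivered

console.log(received); // ['first']
```

Because the comparison is by reference, keep a variable pointing at the object you passed to subscribe() so you can hand the very same object to unsubscribe() later.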

Congratulations!

Resources:

JavaScript for beginners - learn by doing

Saturday, November 23, 2019

Store state management with RXJS BehaviorSubject

Basically, we will be creating and using an observer with the help of the RxJS library. If you would like to see more of RxJS in action, you can take a look at this Angular course.


It is important to note that the store will keep the immutable state principle. This means that the operations we perform on the objects stored inside will not modify the store itself, but will instead produce and return a new state.
Using immutability brings benefits such as preventing data race conditions, and it avoids observers working with anything other than the latest state of an object when several of them interact with it.
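To make the immutability idea concrete, here is a tiny stand-alone sketch (the object contents are just example values) showing that the spread operator produces a new object instead of mutating the old one:

```javascript
// the old state stays untouched; every "change" yields a brand-new object
const oldState = { data: [{ name: 'initial name', votes: 0 }] };

const newState = {
  ...oldState,
  data: [...oldState.data, { name: 'second entry', votes: 0 }]
};

console.log(oldState.data.length);  // 1 — the original state was not modified
console.log(newState.data.length);  // 2
console.log(oldState === newState); // false — two distinct objects
```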

For the demonstration, we will create 2 HTML buttons which, when clicked, will increment or decrement the data inside our store:
<button id="inc">+</button> <button id="dec">-</button>

<span id="state"></span>

#state will respond to changes inside of our observed data.

Then, to use BehaviorSubject we will import it from the rxjs library. It has the ability to save the last emitted value inside itself (thus imitating state), and when a subscriber asks for the saved data (subscribes to the behavior subject), the behavior subject will emit it.
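To see what this buys us, here is a simplified plain-JavaScript model of a BehaviorSubject (an illustration only, not the real RxJS implementation): it remembers the last emitted value and replays it immediately to each new subscriber:

```javascript
// simplified model of a BehaviorSubject: remembers the last value
// and hands it to any new subscriber right away
class SimpleBehaviorSubject {
  constructor(initialValue) {
    this.value = initialValue;
    this.observers = [];
  }
  subscribe(fn) {
    this.observers.push(fn);
    fn(this.value); // replay the stored value immediately
  }
  next(newValue) {
    this.value = newValue;
    this.observers.forEach(fn => fn(newValue));
  }
  getValue() {
    return this.value;
  }
}

const subject = new SimpleBehaviorSubject('initial');
const seen = [];
subject.subscribe(v => seen.push(v)); // receives 'initial' immediately
subject.next('updated');              // receives 'updated' too

console.log(seen); // ['initial', 'updated']
```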

import { BehaviorSubject } from "rxjs";

// we create a class Store with an initial state object which has data inside (name and votes). All this initialization is done inside the constructor of the class.
class Store {
  constructor() {
    this.initialState = {
      data: [
        {
          name: "initial name",
          votes: 0
        }
      ]
    };

// we create a new behavior subject and load up the initialState data inside.
    this.subject$ = new BehaviorSubject(this.initialState);

// we create an observable from the subject, in order to be able to access the data from the behavior subject as read-only (i.e not to be able to write and populate values to other subscribers, thus creating chaos inside the data logic).
    this.state$ = this.subject$.asObservable();
  }

// We then use a getter and a setter, which get and set the state inside the behavior subject. With .next() we emit values to the subject; .getValue() is not used very often and here just returns the current value inside the subject.

  get state() {
    return this.subject$.getValue();
  }
  setState(nextState) { // the function can be also defined as: set state()
    this.subject$.next(nextState);
  }
}

// now it is time to create an object from our previously defined class Store();
const store = new Store();

// When we click on the + button we set a new state inside of our store.

// Note that we have data and state: each state is characterized by its own data. Here we just extend the initial (old) state with a new data object {name, votes}.
document.querySelector("#inc").onclick = function() {
  store.setState(
    //add new objects
    {
      ...store.state,
      data: [...store.state.data, { name: "initial name", votes: 0 }]
    }
  );
};

Searching/querying objects inside the store: our store can also contain objects with different names. Here is how to handle that; we start by attaching a click handler to the + button:

document.querySelector("#inc").onclick = function() {
// this time we can search for a particular named object, we are interested in inside the store
let searched_name = "initial name";
  store.setState(
    {
      ...store.state,
      data : store.state.data.map(store_data => { // we loop through the whole store
        // console.log('data inside the store: ' + JSON.stringify(store_data));
        if (store_data.name === searched_name) {

// and if we have a match we change the corresponding store object by incrementing its current votes property with 1

          return {...store_data, votes: store_data.votes + 1 };
        }
        return store_data; // otherwise we return the unchanged 'store_data'; the array that map() produces becomes the new 'data' property
      })
    }
  );
};

Note: inside setState we return not a modified old store.set object, but a newly created object so keeping the immutability rule true.

// Here is how to delete objects based on a searched_name variable. We filter through all the store data and keep only the objects whose name differs from the value being deleted. Once again, keep an eye on the immutability principle: we return a completely new object and don't modify the store.state object by deleting its entries.
document.querySelector("#dec").onclick = function() {
  const searched_name = "initial name"; // the name whose entries we want to remove
  store.setState(
    {
      ...store.state,
      data: store.state.data.filter(store_data => {
        return store_data.name !== searched_name;
      })
    }
  );
};

Last but not least if we want our HTML to reflect on the changes from the store we subscribe to BehaviorSubject:
store.state$.subscribe(state => {
// we can either display the current state to the console
  console.log("current state: " + JSON.stringify(state));
// or just place it inside our #state span element (the state holds a data array, so we display it as JSON)
document.querySelector('#state').innerHTML = JSON.stringify(state.data);
});

...

Here is an alternative version, which achieves similar functionality (written in TypeScript, e.g. inside an Angular service):

import { BehaviorSubject } from "rxjs";
import { scan } from "rxjs/operators";

// initially we set our empty state object {}:
let initialState = {};

class AppState {
// we again create our behavior subject to store the empty initial state
  private stateSubject = new BehaviorSubject(initialState);
// then we create an observable out of the subject
  state$ = this.stateSubject.asObservable().pipe(
    scan((acc, newVal) => { // on every new value the subject receives, scan gives us that value together with the accumulated previous state acc (scan() in RxJS behaves like reduce() in JavaScript)
      return { ...acc, ...newVal }; // we create a new object consisting of the old accumulator value with the new value's changes applied
    })
  );

// here is the important dispatch function, which simply via .next() places a new payload into a specific object key, thus ensuring that the particular object will have its new state set.
  dispatch(obj) {
    this.stateSubject.next(
      { [obj.key]: obj.payload }
    );
  }

}
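Since the comment above compares scan() to reduce(), the same state-merging can be sketched with plain JavaScript's Array.prototype.reduce, no RxJS involved (the person/settings payloads are made-up example dispatches):

```javascript
// each dispatched partial state is merged over the accumulated previous state,
// mirroring what the scan() in the pipe above does per emission
const emissions = [
  {},                              // initialState
  { person: { name: 'John' } },    // hypothetical dispatch #1
  { settings: { theme: 'dark' } }  // hypothetical dispatch #2
];

const finalState = emissions.reduce((acc, newVal) => ({ ...acc, ...newVal }));

console.log(finalState);
// { person: { name: 'John' }, settings: { theme: 'dark' } }
```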

// optionally we can debug the state inside the observable using Angular's async and json pipes:
{{ appState.state$ | async | json }}

// Here is how we can use the dispatch method: we just send new data to a specific key of our state.
this.appState.dispatch({
      key: 'person',
      payload: {
        name: '',
        website: ''
      }
    });

Congratulations and enjoy the Angular course!

Sunday, November 17, 2019

MongoDB introduction

Let's see how to work with the non-relational database MongoDB. If you are looking for more advanced usage of MongoDB inside of a real-life application you can check this course: Learn Node.js, Express and MongoDB + JWT

 


Install the database with sudo apt install mongodb
*optional: install the graphical client MongoDB Compass:
https://www.mongodb.com/download-center/compass
then run: sudo dpkg -i mongodb-compass*.deb

you can then type: mongo to get inside the interactive shell
inside you can type
show dbs
to show all the available databases:
admin   0.000GB
config  0.000GB
local   0.000GB

In mongodb we have the following structures: databases->collections->documents

use db1 switches the internal db pointer to a different existing database, or creates a new one if none is found. In order to actually create the database, you will need to place at least one collection inside it.
So you can just type: db.createCollection('mycollection')
then show dbs will show your new database; the collections can be displayed using: show collections.
If we want to delete a database we use: db.dropDatabase()

Since we are working with unstructured data, we can just place JSON object literals (called documents in MongoDB) inside the collections:
db.mycollection.insert({
field1:'value1'
})


multiple insertions:
db.mycollection.insertMany(
[
{name:'John', color:'red', position:'programmer'},
{name:'Peter', color:'green', position:'craftsman'},
{name:'Maria', color:'blue', position:'gardener'}
]
)

To display all the documents inside of a collection we could type:
db.mycollection.find().pretty()
other examples include:
db.mycollection.find({field1:'value1'})
or to get just a single document:
db.mycollection.findOne({field1:'value1'})
we can search using multiple criteria; the second parameter is a projection, which here hides the _id field:
db.mycollection.find({field1: {$gt: 5}}, {_id: 0})
or search within a subset of values:
db.mycollection.find({_id: {$in: [1, 2]}})
we can also chain other methods to find, such as limit:
db.mycollection.find({field1:'value1'}).limit(5)

Update in MongoDB can be performed in two ways:
1. We search for a condition to be met, then we replace the whole found document with a new document; if the search condition returns no document, the update fails. To entirely update/replace the found document we use:
update({filter_condition}, {fields_to_update})
inside filter_condition just specify the JSON document key/values to search for, which will be replaced by the fields_to_update JSON document, for example:
db.mycollection.update({field1:'value1'},{field1:'value2'})
if we use upsert, a new document will be inserted when the field1 search condition doesn't match anything:
db.mycollection.update({field1:'value1'},{field1:'value2'},{upsert:true})
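To make these replace/upsert semantics concrete, here is a plain-JavaScript simulation (an illustration only, not MongoDB's actual implementation):

```javascript
// toy model of collection.update(filter, replacement, { upsert }):
// the first matching document is replaced wholesale; with upsert: true,
// a missing document gets inserted instead
function update(collection, filter, replacement, options = {}) {
  const idx = collection.findIndex(doc =>
    Object.keys(filter).every(key => doc[key] === filter[key])
  );
  if (idx !== -1) {
    collection[idx] = { ...replacement }; // replace the whole matched document
  } else if (options.upsert) {
    collection.push({ ...replacement }); // no match: insert when upsert is set
  }
  return collection;
}

const docs = [{ field1: 'value1' }];
update(docs, { field1: 'value1' }, { field1: 'value2' });
console.log(docs); // [{ field1: 'value2' }]

update(docs, { field1: 'missing' }, { field1: 'value3' }, { upsert: true });
console.log(docs); // [{ field1: 'value2' }, { field1: 'value3' }]
```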

2. The second way is again to search for condition, but this time to update specific fields from it - this is actually the expected familiar functionality of the MYSQL update function. We use the set operator, in order to just update specific fields:
db.mycollection.update(
{ _id: 1 },
{
$inc: { quantity: 5 },
$set: { field1: "value2" }
}
)
An interesting thing to notice is that the documents have unique ids, so you can search by those ids in order to locate a specific document.
With $inc we can increment certain fields inside the document (here the quantity field).
Example: db.mycollection.find({ _id:ObjectId("5dd0fb5988cbe5bb79e7a0e2")  } )

Notes: as filtering conditions you can use $gt: 3 (which means > 3) or $lte: 3 (<= 3).
If you would like to rename certain keys inside the document, inside update you can use $rename:
db.mycollection.update(
{ _id: 1 },
{ $rename: { field1: "new_field" } }
)
with remove you can remove a document:
db.mycollection.remove(
{ _id: 1 }
)
We can have nested elements inside of the same document such as:
{
"article":{
"title":"my article",
"comments":[
                    {'title':'first comment'},
                    {'title':'second comment'}
                   ]
}
}

like so:
db.mycollection.insert(
{
"title":"my article",
"comments":[{'title':'first comment'},{'title':'second comment'}]
});

In order to search inside the comments we can use:
db.mycollection.find(  
{  
comments:{  $elemMatch:{ title:'first comment' }  }  
}
)
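In plain JavaScript terms, $elemMatch asks whether at least one element of the array satisfies the condition; a rough stand-alone equivalent (an illustration of the idea, not how MongoDB implements it — the second document is made up for contrast):

```javascript
// documents shaped like the article example above
const documents = [
  { title: 'my article', comments: [{ title: 'first comment' }, { title: 'second comment' }] },
  { title: 'other article', comments: [{ title: 'third comment' }] } // hypothetical extra doc
];

// $elemMatch-like query: keep documents where SOME comment matches
const matched = documents.filter(doc =>
  doc.comments.some(comment => comment.title === 'first comment')
);

console.log(matched.length);   // 1
console.log(matched[0].title); // 'my article'
```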
Keep in mind that when searching this way you have to specify exact match for the text fields. If you would like to perform full-text search you can place indexes on the fields you would like to search:
db.mycollection.createIndex({title:'text'})

Let's now insert multiple documents:
db.mycollection.insertMany(
[
{"_id":1, "title":"my newest"},
{"_id":2, "title":"my article 1"},
{"_id":3, "title":"my article 2"},
{"_id":4, "title":"my article 3"},
{"_id":5, "title":"my article 4"},
{"_id":6, "title":"my article 5"}
]
);

*Optional:
db.mycollection.getIndexes() will display the indexes;
from there you can find the name of the index and use db.mycollection.dropIndex('comments_text') to remove it.
 

then use: db.mycollection.find({ $text: { $search: "\"article\"" } })
to search inside the indexed field. Escape the search string with \" when doing a phrase search.
Note: for the full-text search to return useful results, aim to store words longer than 4 characters inside the database!

Congratulations, you now know some of the basics when working with MongoDB!

Monday, November 11, 2019

Laravel development environment under Ubuntu 19.10

This guide is on how to install the Laravel framework under Ubuntu and set up a local development environment. 

Reference: Practical Ubuntu Linux Server for beginners

First, we will install Apache & MariaDB:
sudo apt install apache2 mariadb-server mariadb-client
then we will set up the root password for the MySQL installation:
sudo mysql_secure_installation
Next, we will log in inside MySQL with: sudo mysql -u root
to create our initial database:
create database laravel;
and create a user for the Laravel installation: laravel with password: password
CREATE USER 'laravel'@'%' IDENTIFIED BY 'password';
at the same time we will grant the user privileges on all databases:
GRANT ALL PRIVILEGES ON *.* TO 'laravel'@'%' ;
Then we will restart the MariaDB server to activate the changes:
sudo systemctl restart mariadb.service
Now it is time to install PHP support for Apache, together with the extensions Laravel needs:
sudo apt install php libapache2-mod-php php-common php-mbstring php-xmlrpc php-soap php-gd php-xml php-mysql php-cli php-zip

optionally we can set limits inside php.ini
sudo nano /etc/php/7.3/apache2/php.ini
memory_limit = 256M
upload_max_filesize = 64M
cgi.fix_pathinfo=0


Next, we will install Curl for the composer to be able to run:
sudo apt install curl
Then to install composer we can use:
curl -sS https://getcomposer.org/installer | sudo php -- --install-dir=/usr/local/bin --filename=composer


/*
only if we want to use the laravel command:
let's update the local PATH to be able to access composer vendor binaries, and particularly to run laravel:
export PATH="$HOME/.config/composer/vendor/bin:$PATH"
if you want the PATH change to be persistent, just add the line to your .bashrc file.
*/

It is time to create our project:
cd /var/www/html/
sudo composer create-project laravel/laravel --prefer-dist 
In order for Laravel's artisan, Apache, and our user to be able to access, read, and write to the framework, we will need to fix the ownership and permissions of the installation.
We set www-data as the owner and group of the Laravel installation:

sudo chown www-data:www-data /var/www/html/laravel/ -R
Next, we will make sure that all existing files have rwx permissions for the owner and group, and that newly created files will also belong to the www-data group:
sudo chmod -R 770 /var/www/html/laravel
sudo setfacl -d -m g:www-data:rwx /var/www/html/
All that is left is to add our current user to the www-data group:
sudo usermod -a -G www-data $USER
and will switch the current user context to the www-data group:
newgrp www-data

Let's set up the Apache web-server to serve Laravel:
disable the default Apache site configuration:
sudo a2dissite 000-default.conf
enable nice URLs:
sudo a2enmod rewrite

create configuration for laravel:
sudo nano /etc/apache2/sites-available/laravel.conf
  
<VirtualHost *:80>
    ServerName laravel.local
    ServerAdmin webmaster@localhost
    DocumentRoot /var/www/html/laravel/public

    <Directory /var/www/html/laravel/public>
        AllowOverride All
    </Directory>

    ErrorLog ${APACHE_LOG_DIR}/error.log
    CustomLog ${APACHE_LOG_DIR}/access.log combined
</VirtualHost>


enable the laravel configuration site:
sudo a2ensite laravel.conf


Installing PHPMyAdmin
sudo apt install php-curl
sudo composer create-project phpmyadmin/phpmyadmin

create configuration for phpmyadmin:
sudo nano /etc/apache2/sites-available/phpmyadmin.conf
   
<VirtualHost *:80>
    ServerName phpmyadmin.local
    ServerAdmin webmaster@localhost
    DocumentRoot /var/www/html/phpmyadmin

    <Directory /var/www/html/phpmyadmin>
        AllowOverride All
    </Directory>

    ErrorLog ${APACHE_LOG_DIR}/error.log
    CustomLog ${APACHE_LOG_DIR}/access.log combined
</VirtualHost>

enable the phpmyadmin configuration site:
sudo a2ensite phpmyadmin.conf

In order to activate changes we restart the Apache server:
sudo systemctl restart apache2

Because both of the local domains will be accessible through the same IP 127.0.0.1, we will add them to the /etc/hosts file:
sudo nano /etc/hosts
127.0.0.1       laravel.local
127.0.0.1       phpmyadmin.local
(now we can browse those two host entries inside a browser)
http://laravel.local
http://phpmyadmin.local


Now let's fix some warnings. To run composer without the need for sudo, we will run:
sudo chown -R $USER ~/.config/composer
this gives our user ownership over the composer-installed binaries.

Database setup
open up .env and set our details taken from the mariadb setup:
DB_DATABASE=laravel
DB_USERNAME=laravel
DB_PASSWORD=password


Development helping utilities
We will help VSCode to recognize methods inside facades and models:

composer require --dev barryvdh/laravel-ide-helper
php artisan ide-helper:generate
php artisan ide-helper:meta
php artisan ide-helper:models --nowrite
(the --nowrite flag places the generated phpdoc in a separate file instead of writing it into the models)
So that your IDE auto-regenerates these files whenever the versions of your models/packages/facades change, place inside the composer.json scripts section:
"post-update-cmd": [
        "Illuminate\\Foundation\\ComposerScripts::postUpdate",
        "@php artisan ide-helper:generate",
        "@php artisan ide-helper:meta",

        "@php artisan ide-helper:models --nowrite"
]


Navigation inside Visual Studio Code
inside /var/www/html/laravel we start the editor:
code .
Inside VSCode:
Press alt + c to toggle case sensitive highlighting and install the following extensions:
Laravel blade snippets, Laravel extra IntelliSense, Laravel goto view, PHP debug, PHP intelephense, PHP namespace resolver. (and optional: Local History)
Go to "User Settings" > "Extensions" > "Blade Configuration" and enable formatting.
Notes on general navigation and debugging through code when developing:
With Ctrl+click we can navigate through locations such as controllers, classes, and blade templates. To return back to the previous location just press: Ctrl+Alt+-.
To search inside of all files inside the project we can use: CTRL+SHIFT+F
In order to list all available classes, methods and properties inside of a file just use:
CTRL+Shift+O
beforehand you need to disable the integrated PHP suggestions from preferences: "php.suggest.basic": false

We will install debugbar for in-browser debugging purposes:
sudo composer require barryvdh/laravel-debugbar --dev
Debugbar also very nicely displays information about the current request (such as the queries it runs) while browsing. To debug inside the code, we can also use: dd('what we want to debug'); In case we have lots of variables, just place them inside an array:
$test[]=$var1;
$test[]=$var2;
and then dd($test); will output them all. Try also dd(__METHOD__); It will give you the invocation method location and name.
For adding/removing multiple lines comments toggle: Ctrl + /
in case you would like to know all the available routes, you can use: php artisan route:list

Let's go to the frontend part:
Let's install npm; here is how to get its latest version:
curl https://www.npmjs.com/install.sh | sudo sh
Then the latest version of Node.js from NodeSource:
curl -sL https://deb.nodesource.com/setup_13.x | sudo -E bash -
sudo apt-get install -y nodejs

We will require the user interface library and will place it inside as a development dependency:
composer require laravel/ui --dev
Now we add Vue as the user interface framework (if you don't want to generate the default authentication scaffolding, just omit --auth):
php artisan ui vue --auth
We will install the required laravel UI package dependencies using npm, and when we have them we will compile a development version of all the front-end JavaScript and CSS styles:
npm install && npm run dev
In case of problems during the packages' installation we will do:
npm cache clean --force
and then repeat the last build with npm install && npm run dev
(if you look at the contents of webpack.mix.js, you will see that they point from /resources/ to /public/ directory, so when running npm run dev, actually we grab and compile those resources and place the output inside /public/css and public/js directories)

After our DB connection is set up we can run php artisan migrate to apply the initial migrations. This is important because these migrations create the tables needed for authentication. When the migration completes we can test the authentication by creating a user and logging into the Laravel system.

Now we will discuss the simple Laravel flow. In order to protect certain route such as /home, we can use:
$this->middleware('auth');
You can see the function inside the constructor of this controller: public function __construct(){$this->middleware('auth');}
At the same time, how do we know that when we type /home in our browser, exactly App/Http/Controllers/HomeController.php will be loaded?
Well, let's open web.php inside the /routes directory
and focus on the line:
Route::get('/home', 'HomeController@index')->name('home');
which says: if we request /home, the HomeController with its index method will be loaded.
Then if we follow HomeController.php's index method we see:
public function index(){return view('home');}
return view('home') simply means: load up home.blade.php - this is where our HTML template resides. Go ahead, make some changes inside and reload the /home URL to see them live:
Inside home.blade.php we can place:
@auth
You are logged in!
{{Auth::user()->name}}
@else
still not logged in.
@endauth
and inside welcome.blade.php:
@guest
Hi guest!
@endguest
Just save and see the change!

Congratulations!

Tuesday, November 05, 2019

Install PHP, MySql and Apache inside Ubuntu 19.10

This article shows how to install PHP, MySQL, and Apache on Ubuntu 19.10 in 6 easy steps. It can be useful when performing system administration or starting to learn web development. Here is a video on the subject:

1. We will start with the Apache webserver:
sudo apt install apache2
which will install and start the Apache on port 80
test by pointing the browser to http://localhost

2. Next, let's fix some permissions and ownership:
go to cd /var/www/ and if you type ls -la, you will see that all the files and directories are owned by root:root. Let's fix this in order to have access to the files inside this directory. First, we will put our current user into the www-data group:
sudo usermod -a -G www-data $USER
and then with sudo chown www-data:www-data /var/www -R
we will recursively set all the files inside /var/www to belong to www-data which our user just became a member of. Now check the result with ls -la.

After the ownership, we will take care of the file and directory permissions:
sudo chmod -R 770 /var/www
With this line we set read-write-execute for the owner and the group on all files and directories. (To have newly created files also inherit the www-data group, you can additionally set the setgid bit: sudo chmod g+s /var/www.)

3. Editor
Install the Visual Studio Code:
sudo apt install code
then inside the /var/www directory type code .
Create a file index.php (with the editor, or with nano index.php) containing:
<?php
phpinfo();
?>

4. PHP
now it is time to add a way for Apache to interpret PHP code:
sudo apt install libapache2-mod-php7.3
then restart the Apache server with sudo systemctl restart apache2
Point your browser again to http://localhost/index.php and you should be able to see the information from the phpinfo() function;

5. MySQL server
sudo apt install mysql-server
sudo mysql_secure_installation where please set a root password!
mysql -uroot -p (enter the password generated in the previous step)
when ready just type: use mysql; select plugin from user where User = 'root';
In the resulting table you should see mysql_native_password. If not, please type:
ALTER USER 'root'@'localhost' IDENTIFIED WITH mysql_native_password BY 'mysql'; Here we set 'mysql' as the password and mysql_native_password as the authentication method, in order to be able to log in to the MySQL database from our applications;
followed by: FLUSH PRIVILEGES; for the changes to take effect.

6. PHP-MySql connection
Exit the MySQL prompt and type:
sudo apt install php7.3-mysql
and again restart the Apache server with sudo systemctl restart apache2
Now paste the following code inside the index.php file and run again index.php in the browser:

<?php
$servername = "localhost";
$username = "root";
$password = "mysql";
try {
$conn = new PDO("mysql:host=$servername;dbname=mysql", $username, $password);
$conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
echo "Connected successfully";
}
catch(PDOException $e)
{
echo "Connection failed: " . $e->getMessage();
}
?>
You should be able to see: Connected successfully!
Congratulations!

Sunday, November 03, 2019

Install NodeJS and Angular on Windows 10 WSL2

Let's see how, under the Windows Subsystem for Linux (WSL 2), we can set up Node.js and npm and install Angular, so that we can later do our web development projects or try examples from Angular courses. You can also watch the video on the installation.



We will first enable WSL 2 in order to be able to support and load Linux systems:
launch PowerShell by typing powershell and, via right click, run it in administrative mode. Paste the following content, which will enable WSL as well as the virtual machine platform:
Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Windows-Subsystem-Linux 
Enable-WindowsOptionalFeature -Online -FeatureName VirtualMachinePlatform
wsl --set-default-version 2
The last line sets the more performant version 2 of WSL as the default when running an OS such as Ubuntu.

Next, we will go to Microsoft's store, then download and launch the Ubuntu application. When launching it, you might be prompted to restart your computer. Then try to launch the Ubuntu application again by just typing ubuntu. It will ask you to set up a default user and password so you can access the Ubuntu system.

Now it is time to update the local distribution packages with:
sudo apt update && sudo apt dist-upgrade

Installing Angular
Since the Ubuntu version provided in WSL does not ship the latest Node.js, we will go to https://github.com/nodesource/distributions
and install the latest available node version:
curl -sL https://deb.nodesource.com/setup_13.x | sudo -E bash -
sudo apt-get install -y nodejs


We are ready to install the Angular CLI:
sudo npm i -g @angular/cli
(we install the package globally, to be able to execute the ng command from anywhere inside our system)
Now we can type ng new new_project, followed by cd new_project and ng serve.
You can browse the newly created project under http://localhost:4200

Congratulations!
