Tuesday, 31 January 2017

PostgreSQL: Basic Steps in Ubuntu Using the Terminal

1) To open psql in the Terminal
   $sudo -i -u postgres psql

2) To list the available databases
postgres=# \l

3) To create a database
      postgres=# create database manju;
           (manju is the database name)

4) To connect to the database
     postgres=# \c manju

5) To create a schema
    postgres=# create schema publicschema;

6) To list tables and other relations
  postgres=# \d

7) To create a table
   postgres=# create table publicschema.table1 (id integer, password CHAR(10));

8) To check the catalog
    postgres=# select * from pg_catalog.pg_tables;

    To exclude the system schemas:
    postgres=# select * from pg_catalog.pg_tables where schemaname != 'information_schema' and schemaname != 'pg_catalog';


9) To insert values
     postgres=# insert into publicschema.table1 values (1, '1');

10) To view the table
   postgres=# select * from publicschema.table1;

11) To truncate the table
     postgres=# truncate publicschema.table1;

12) To delete all rows
      postgres=# delete from publicschema.table1;

13) To drop the table
    postgres=# drop table publicschema.table1;
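The table-level commands above can also be collected into one SQL script and run non-interactively instead of typing them at the prompt. A minimal sketch, assuming PostgreSQL is installed and the manju database already exists (the /tmp path and file name are just for illustration):

```shell
# Collect the schema/table steps into a single SQL script.
cat > /tmp/manju_demo.sql <<'SQL'
CREATE SCHEMA publicschema;
CREATE TABLE publicschema.table1 (id integer, password CHAR(10));
INSERT INTO publicschema.table1 VALUES (1, '1');
SELECT * FROM publicschema.table1;
TRUNCATE publicschema.table1;
DROP TABLE publicschema.table1;
SQL
# On a machine with PostgreSQL running, execute it as the postgres user:
# sudo -i -u postgres psql -d manju -f /tmp/manju_demo.sql
cat /tmp/manju_demo.sql
```

Running the script with psql -f prints each statement's result in order, which makes it easy to rerun the whole exercise from scratch.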

Wednesday, 11 January 2017

Scala and Spark Installation


Step 1: Verifying Java Installation
Java is required to install Spark. Use the following command to verify the Java version.
$java -version 
If Java is already installed on your system, you will see a response like the following −
java version "1.7.0_71" 
Java(TM) SE Runtime Environment (build 1.7.0_71-b13) 
Java HotSpot(TM) Client VM (build 25.0-b02, mixed mode)
If you do not have Java installed on your system, install Java before proceeding to the next step.
Step 2: Verifying Scala installation
You need the Scala language to use Spark, so let us verify the Scala installation with the following command.
$scala -version
If Scala is already installed on your system, you will see a response like the following −
Scala code runner version 2.11.6 -- Copyright 2002-2013, LAMP/EPFL
If you do not have Scala installed on your system, proceed to the next step to install it.
Step 3: Downloading Scala
Download the latest version of Scala from the Scala downloads page. This tutorial uses scala-2.11.6. After downloading, you will find the Scala tar file in your Downloads folder.
Step 4: Installing Scala
Follow the below given steps for installing Scala.
Extract the Scala tar file
Type the following command for extracting the Scala tar file.
$ tar xvf scala-2.11.6.tgz
Move Scala software files
Use the following commands to move the Scala files to their target directory (/usr/local/scala).
$ su - 
Password: 
# cd /home/Hadoop/Downloads/ 
# mv scala-2.11.6 /usr/local/scala 
# exit 
Set PATH for Scala
Use the following command for setting PATH for Scala.
$ export PATH=$PATH:/usr/local/scala/bin
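The export above lasts only for the current shell session. To make it permanent, the same line can be appended to ~/.bashrc — a sketch, assuming the /usr/local/scala install path used above:

```shell
# Persist the PATH change for future shells:
echo 'export PATH=$PATH:/usr/local/scala/bin' >> ~/.bashrc
# Apply it to the current shell as well:
export PATH=$PATH:/usr/local/scala/bin
# Confirm the Scala bin directory is now on PATH:
echo "$PATH" | grep -q '/usr/local/scala/bin' && echo "scala is on PATH"
```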
Verifying Scala Installation
After installation, it is better to verify it. Use the following command for verifying Scala installation.
$scala -version
Step 5: Downloading Apache Spark
Download the latest version of Spark from the Apache Spark downloads page. This tutorial uses spark-1.3.1-bin-hadoop2.6. After downloading it, you will find the Spark tar file in your Downloads folder.
Step 6: Installing Spark
Follow the steps given below for installing Spark.
Extracting Spark tar
Use the following command to extract the Spark tar file.
$ tar xvf spark-1.3.1-bin-hadoop2.6.tgz 
Moving Spark software files
Use the following commands to move the Spark files to their target directory (/usr/local/spark).
$ su - 
Password:  

# cd /home/Hadoop/Downloads/ 
# mv spark-1.3.1-bin-hadoop2.6 /usr/local/spark 
# exit 
Setting up the environment for Spark
Add the following line to the ~/.bashrc file. It adds the location of the Spark binaries to the PATH variable (note: no spaces around the = sign).
export PATH=$PATH:/usr/local/spark/bin
Use the following command for sourcing the ~/.bashrc file.
$ source ~/.bashrc
Step 7: Verifying the Spark Installation
Write the following command for opening Spark shell.
$spark-shell
If Spark is installed successfully, you will see output like the following.
Spark assembly has been built with Hive, including Datanucleus jars on classpath 
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 
15/06/04 15:25:22 INFO SecurityManager: Changing view acls to: hadoop 
15/06/04 15:25:22 INFO SecurityManager: Changing modify acls to: hadoop
15/06/04 15:25:22 INFO SecurityManager: SecurityManager: authentication disabled;
   ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop) 
15/06/04 15:25:22 INFO HttpServer: Starting HTTP Server 
15/06/04 15:25:23 INFO Utils: Successfully started service 'HTTP class server' on port 43292. 
Welcome to 
      ____              __ 
     / __/__  ___ _____/ /__ 
    _\ \/ _ \/ _ `/ __/  '_/ 
   /___/ .__/\_,_/_/ /_/\_\   version 1.4.0 
      /_/  
                
Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_71) 
Type in expressions to have them evaluated. 
Spark context available as sc  

scala> 
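As a quick smoke test at the scala> prompt, a one-line job can be saved to a file and preloaded into the shell. A sketch, assuming the default SparkContext sc that spark-shell creates (the /tmp path is just for illustration):

```shell
# Save a one-line Spark job to a file.
cat > /tmp/spark_smoke.scala <<'EOF'
// Sum the numbers 1..100 as an RDD; Spark should print 5050.0
println(sc.parallelize(1 to 100).sum())
EOF
# On a machine with spark-shell on the PATH, preload it into the shell:
# spark-shell -i /tmp/spark_smoke.scala
cat /tmp/spark_smoke.scala
```

If the job prints the expected sum, the Spark installation and its local executor are working end to end.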

Monday, 9 January 2017

How To Install and Use PostgreSQL on Ubuntu


 Since we haven't updated our local apt repository lately, let's do that now. We can then get the Postgres package and a "contrib" package that adds some additional utilities and functionality:

$sudo apt-get update 
$sudo apt-get install postgresql postgresql-contrib

 The installation procedure created a user account called postgres that is associated with the default Postgres role. In order to use Postgres, we'll need to log into that account. You can do that by typing:

$sudo -i -u postgres


 You will be asked for your normal user password and then will be given a shell prompt for the postgres user.

You can get a Postgres prompt immediately by typing:

$psql 

Exit out of the PostgreSQL prompt by typing: 

postgres=# \q

            PostgreSQL in UI Mode (phpPgAdmin)


 Step 1 - Installing PostgreSQL, phpPgAdmin and Apache2

PostgreSQL and phpPgAdmin are available in the Ubuntu repository, so you just need to install them with the apt command.  


$sudo apt-get -y install postgresql postgresql-contrib phppgadmin
Step 2 - Configure PostgreSQL user
 PostgreSQL uses roles for user authentication and authorization, much like Unix-style permissions. By default, PostgreSQL creates a new user called "postgres" for basic authentication. To use PostgreSQL, you need to log in to the "postgres" account, which you can do by typing:
  $sudo su
  $su - postgres
Now you can access the PostgreSQL prompt with the command:
$psql
And then change the password for postgres role by typing:
postgres=# \password postgres
Enter and confirm your new password when prompted.
Then enter \q to leave the psql command line.
Run the command "exit" to leave the postgres user and become root again.
$exit
Step 3 - Configure Apache2 
You need to configure apache for phpPgAdmin. Edit the file /etc/apache2/conf-available/phppgadmin.conf with nano by typing:
$cd /etc/apache2/conf-available/
$nano phppgadmin.conf 
Comment out the line Require local by adding a # in front of it, and add the line allow from all below it so that you can access phpPgAdmin from your browser.

 Step 4 - Configure phpPgAdmin
 Edit the file /etc/phppgadmin/config.inc.php by typing : 
$cd /etc/phppgadmin/
$nano config.inc.php
Find the line $conf['extra_login_security'] = true; and change the value to false so you can login to phpPgAdmin with user postgres. 
Step 5 - Restart PostgreSQL and Apache2
$systemctl restart postgresql
$systemctl restart apache2

Step 6 - Testing
Now open phpPgAdmin in your browser at http://yourip/phppgadmin/ and log in with the user postgres and your password. After logging in you will see the phpPgAdmin interface.



 

Pentaho Installation

                 Pentaho Installation for Ubuntu





Step 1: Download Pentaho
            from http://community.pentaho.com/projects/data-integration/  

Step 2: Go to the Terminal, change into the data-integration folder, and run spoon.sh
  snipe@ubuntu:~$ cd /home/snipe/Desktop/data-integration/
  snipe@ubuntu:~$ ./spoon.sh