
how to read a file from java with example

In this post, we will learn how to read a file in Java with examples

These are called IO (input/output) operations

There are many ways to read a file in Java: BufferedReader, Scanner, and streams via the NIO package

BufferedReader

BufferedReader is one of the most common ways to read a file in Java. It reads the file line by line

Its read operations are synchronized, so they are thread safe

Once the read operation is done, we need to close the BufferedReader

There is a chance of an IOException in this code, so we should surround it with a try-catch block

Example

import java.io.BufferedReader;
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStreamReader;

public class ReadFileExample {

	public static void main(String[] args) {
		try {
			
			File file = new File("C:\\temp\\sample.txt");
			FileInputStream fileInputStream = new FileInputStream(file);
			InputStreamReader inputStreamReader = new InputStreamReader(fileInputStream);
			
			// Converting the input stream to a buffered reader
			BufferedReader bufferedReader = new BufferedReader(inputStreamReader);
			String temp = "";
			StringBuffer textFileContent = new StringBuffer();
			while ((temp = bufferedReader.readLine()) != null) {
				textFileContent.append(temp);
			}
			
			System.out.println(textFileContent);
			// Closing the BufferedReader once reading is done
			bufferedReader.close();
		} catch (Exception e) {
			e.printStackTrace();
		}

	}

}
Output
sdfdsfdsfsdfsHelloworld
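Try with resources

Since Java 7, a try-with-resources block closes the reader for us automatically, even when an exception is thrown. Below is a minimal sketch of the same line-by-line read, assuming the same sample.txt path as above.

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class ReadFileTryWithResourcesExample {

	public static void main(String[] args) {
		// The reader declared here is closed automatically when the block exits
		try (BufferedReader bufferedReader = new BufferedReader(new FileReader("C:\\temp\\sample.txt"))) {
			String temp;
			StringBuffer textFileContent = new StringBuffer();
			while ((temp = bufferedReader.readLine()) != null) {
				textFileContent.append(temp);
			}
			System.out.println(textFileContent);
		} catch (IOException e) {
			e.printStackTrace();
		}
	}
}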
Scanner

In this example, we will do the same operation using Scanner. Here also we use a while loop to iterate over the contents of the file, just like with BufferedReader

Once we are done with the operation, we need to close the Scanner, just like the BufferedReader

Scanner Example

import java.io.File;
import java.util.Scanner;

public class ReadFileScannerExample {

	public static void main(String[] args) {
		try {
			File file = new File("C:\\temp\\sample.txt");
			
			Scanner scanner = new Scanner(file);
			StringBuffer textFileContent = new StringBuffer();
			while (scanner.hasNextLine()) {
				textFileContent.append(scanner.nextLine());
			}
			System.out.println(textFileContent);
			scanner.close();
		} catch (Exception e) {
			e.printStackTrace();
		}
	}
}
Output
sdfdsfdsfsdfsHelloworld
Using the NIO package

The NIO package provides the Files.readAllBytes() method, which reads the file as a byte array. We can then convert the bytes into a String, as in the example below

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class Sample {

	public static void main(String[] args) throws IOException {
		byte[] encoded = Files.readAllBytes(Paths.get("D:\\workspace\\sample.txt"));
		String s = new String(encoded, StandardCharsets.US_ASCII);
		System.out.println(s);
	}
}
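Using Streams

The NIO package can also read the lines lazily as a Java 8 Stream using Files.lines(). Below is a minimal sketch; the file path is an assumption, as in the earlier examples.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.stream.Stream;

public class ReadFileStreamExample {

	public static void main(String[] args) throws IOException {
		// Files.lines returns a lazy Stream<String>; try-with-resources closes the underlying file
		try (Stream<String> lines = Files.lines(Paths.get("C:\\temp\\sample.txt"))) {
			lines.forEach(System.out::println);
		}
	}
}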
java.io.FileNotFoundException

If Java is not able to find the file, it will throw the exception below

java.io.FileNotFoundException: C:\temp\sample.txt1 (The system cannot find the file specified)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at java.util.Scanner.<init>(Scanner.java:611)
	at com.geeks.example.ReadFileScannerExample.main(ReadFileScannerExample.java:12)
Github

https://github.com/rkumar9090/BeginnersBug/blob/master/BegineersBug/src/com/geeks/example/files/ReadFileExample.java

https://github.com/rkumar9090/BeginnersBug/blob/master/BegineersBug/src/com/geeks/example/files/ReadFileScannerExample.java

Related Articles

create text file from java with example


create text file from java with example

In this post, we are going to learn how to create a text file in Java with an example

In this example, we are going to use the File class to create the file

Please make sure your code is inside a try-catch block, because there is a chance of getting an IOException

Prerequisites
  • Java installed
  • Your program should have permission to create a file on the given path
Syntax
	file.createNewFile();
Example
import java.io.File;

public class CreateFileExample {

	public static void main(String[] args) {
		try {
			File file = new File("C:\\temp\\sample.txt");
			// .exists() checks if the file already exists on the path
			if (!file.exists()) {
				// Below line will create the file sample.txt in the C:\temp folder
				file.createNewFile();
				System.out.println("File created successfully !!");
			} else {

				System.err.println("File already exists");
			}

		} catch (Exception e) {
			e.printStackTrace();
		}
	}

}
Output
File created successfully !!

If the file already exists in the C:\temp folder, we will get the output below

File already exists
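Using the NIO package

On Java 7 and above, the java.nio.file.Files class can create the file as well. Below is a minimal sketch under the same assumptions as the example above (same path, same existence check).

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class CreateFileNioExample {

	public static void main(String[] args) {
		Path path = Paths.get("C:\\temp\\sample.txt");
		try {
			if (Files.notExists(path)) {
				// Throws IOException if the file cannot be created
				Files.createFile(path);
				System.out.println("File created successfully !!");
			} else {
				System.err.println("File already exists");
			}
		} catch (IOException e) {
			e.printStackTrace();
		}
	}
}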
Access denied

Make sure Java has permission to create the file in the folder; otherwise you will get an access denied exception like the one below

java.io.IOException: Access is denied
	at java.io.WinNTFileSystem.createFileExclusively(Native Method)
	at java.io.File.createNewFile(File.java:1012)
	at com.geeks.example.CreateFileEaxmple.main(CreateFileEaxmple.java:11)
Github

https://github.com/rkumar9090/BeginnersBug/blob/master/BegineersBug/src/com/geeks/example/files/CreateFileExample.java

Related Articles

how to read a file from java with example


from_unixtime in pyspark with example

In this post, we will learn about from_unixtime in pyspark with an example.

Sample program

In order to pass a date value into a dataframe column, we will go with this option.

Using lit() we can pass any literal value into the dataframe, but a date passed this way is stored as a plain string and can't be retrieved properly as a date.

Here, unix_timestamp() and from_unixtime() help us do this easily: unix_timestamp() parses the string into epoch seconds, and from_unixtime() formats those seconds back into the date pattern we want.

import findspark
findspark.init()
from pyspark import SparkContext,SparkConf
from pyspark.sql import SparkSession
from pyspark.sql.functions import *
sc=SparkContext.getOrCreate()
spark=SparkSession.builder.getOrCreate()
#Single-row dataframe to hold the converted value
df=spark.createDataFrame([(1,)],['id'])
df=df.select(from_unixtime(unix_timestamp(lit('2018-09-30 00:00:00')),'yyyy-MM-dd'))
print("Printing df below")
df.show()
Output
Printing df below
+-----------------------------------------------------------------------------------+
|from_unixtime(unix_timestamp(2018-09-30 00:00:00, yyyy-MM-dd HH:mm:ss), yyyy-MM-dd)|
+-----------------------------------------------------------------------------------+
|                                                                         2018-09-30|
+-----------------------------------------------------------------------------------+
Reference

https://spark.apache.org/docs/2.2.0/api/python/pyspark.sql.html#pyspark.sql.functions.from_unixtime

how to get the current date in pyspark with example

How to change the date format in pyspark


how to add/subtract months to the date in pyspark

In this post, we will learn how to add or subtract months to a date in pyspark with examples.

Creating dataframe – Sample program

With the following program, we first create a dataframe df with a column dt populated with the date value '2019-02-28'

import findspark
findspark.init()
from pyspark import SparkContext,SparkConf
from pyspark.sql import SparkSession
from pyspark.sql.functions import *
sc=SparkContext.getOrCreate()
spark=SparkSession.builder.getOrCreate()
#Creating a dataframe df with date column
df=spark.createDataFrame([('2019-02-28',)],['dt'])
print("Printing df below")
df.show()
Output

The dataframe is created with the date value as below.

Printing df below
+----------+
|        dt|
+----------+
|2019-02-28|
+----------+
Adding months – Sample program

In the next step, we will create another dataframe df1 by adding months to the column dt using add_months()

date_format() helps us convert the string '2019-02-28' into a date by specifying the date format within the function.

You can learn more about date_format() at https://beginnersbug.com/how-to-change-the-date-format-in-pyspark/

#Adding the months 
df1=df.withColumn("months_add",add_months(date_format('dt','yyyy-MM-dd'),1))
print("Printing df1 below")
df1.show()
Output

add_months(column name, number of months) requires two inputs: the date column to be considered and the number of months to be incremented or decremented

Printing df1 below
+----------+----------+
|        dt|months_add|
+----------+----------+
|2019-02-28|2019-03-31|
+----------+----------+
Subtracting months – Sample program

We can even decrement the months by passing a negative value

#Subtracting the months 
df2=df.withColumn("months_sub",add_months(date_format('dt','yyyy-MM-dd'),-1))
print("Printing df2 below")
df2.show()
Output

Hence we get the date one month back using the same function.

Printing df2 below
+----------+----------+
|        dt|months_sub|
+----------+----------+
|2019-02-28|2019-01-31|
+----------+----------+
Reference

https://spark.apache.org/docs/2.2.0/api/python/pyspark.sql.html#pyspark.sql.functions.add_months

from_unixtime in pyspark with example


calculate number of days between two dates using java

In this tutorial, we will learn how to calculate the number of days between two dates using Java

There are a lot of ways to calculate the number of days between two dates in Java

Using Java 8

In the example below, we are going to use ChronoUnit to calculate the days between two dates

Syntax
ChronoUnit.DAYS.between(startDate, endDate);
Example

import java.time.LocalDate;
import java.time.Month;
import java.time.temporal.ChronoUnit;

public class CalculateDaysBetweenDatesJava8 {

	public static void main(String[] args) {
		try {
			// Start date is 2020-03-01 (YYYY-MM-dd)
			LocalDate startDate = LocalDate.of(2020, Month.MARCH, 1);

			// end date is 2020-03-03 (YYYY-MM-dd)
			LocalDate endDate = LocalDate.of(2020, Month.MARCH, 3);

			long numberOfDays = ChronoUnit.DAYS.between(startDate, endDate);

			System.out.println("Number of days " + numberOfDays);

		} catch (Exception e) {
			e.printStackTrace();
		}
	}

}
Output
Number of days 2

The above example is the easiest way to calculate the days between two dates

In the example below, we are going to use the traditional way to calculate the days between two dates

Below Java 8

In the example below, we are going to use GregorianCalendar to calculate the number of days between two dates

Example

import java.util.Calendar;
import java.util.GregorianCalendar;

public class CalculateDaysUsingCalender {

	public static void main(String[] args) {
		try {
			Calendar startDate = new GregorianCalendar();
			Calendar endDate = new GregorianCalendar();

			// Start date is 2020-03-01 (YYYY-MM-dd); Calendar months are zero-based
			startDate.set(2020, Calendar.MARCH, 1);

			// end date is 2020-03-03 (YYYY-MM-dd)
			endDate.set(2020, Calendar.MARCH, 3);

			// subtract endTime - startTime and divide by (1000 * 60 * 60 * 24)
			int i = (int) ((endDate.getTime().getTime() - startDate.getTime().getTime()) / (1000 * 60 * 60 * 24));
			System.out.println("Number of days " + i);

		} catch (Exception e) {
			e.printStackTrace();
		}
	}

}
Output
Number of days 2
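Using Period

java.time also offers Period.between(), which breaks the difference into years, months, and days. Below is a quick sketch for the same dates; note that getDays() returns only the days component, so for gaps longer than a month, ChronoUnit.DAYS from the first example is the safer choice.

import java.time.LocalDate;
import java.time.Period;

public class CalculateDaysUsingPeriod {

	public static void main(String[] args) {
		LocalDate startDate = LocalDate.of(2020, 3, 1);
		LocalDate endDate = LocalDate.of(2020, 3, 3);
		// Period splits the difference into years, months and days
		Period period = Period.between(startDate, endDate);
		System.out.println("Number of days " + period.getDays());
	}
}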
Github

https://github.com/rkumar9090/BeginnersBug/blob/master/BegineersBug/src/com/geeks/example/CalculateDaysUsingCalender.java

https://github.com/rkumar9090/BeginnersBug/blob/master/BegineersBug/src/com/geeks/example/CalculateDaysBetweenDatesJava8.java

Related Articles

convert String to date in java with example

compare two dates in java example


How to change the date format in pyspark

In this post, we will learn how to change the date format in pyspark

Creating dataframe

In order to understand this better, we will create a dataframe having the date format yyyy-MM-dd.

Note: in createDataFrame, the letters D and F need to be capitalized

#Importing libraries required
import findspark
findspark.init()
from pyspark import SparkContext,SparkConf
from pyspark.sql import SparkSession
from pyspark.sql.functions import *

sc=SparkContext.getOrCreate()
spark=SparkSession.builder.getOrCreate()
#creating dataframe with date column
df=spark.createDataFrame([('2019-02-28',)],['dt'])
df.show()
Output

With the above code ,  a dataframe named df is created with dt as one its column as below.

+----------+
|        dt|
+----------+
|2019-02-28|
+----------+
Changing the format

With the dataframe created by the above code, the function date_format() is used to modify its format.

date_format(<column_name>,<format required>)

#Changing the format of the date
df.select(date_format('dt','yyyy/MM/dd').alias('new_dt')).show()
Output

Thus we convert the date 2019-02-28 to the format 2019/02/28

+----------+
|    new_dt|
+----------+
|2019/02/28|
+----------+
Reference

https://spark.apache.org/docs/2.1.0/api/python/pyspark.sql.html#pyspark.sql.functions.date_format

how to get the current date in pyspark with example


how to get the current date in pyspark with example

In this post, we will learn how to get the current date in pyspark with an example

Getting current date

The following lines help to get the current date and time.

import datetime
now = datetime.datetime.now()
#Getting Current date and time
print (now.strftime("%Y-%m-%d %H:%M:%S"))
Output
2020-02-26 21:21:03
Getting current date and current timestamp within dataframe

current_date() helps to get the current date, and current_timestamp() is used to get the current timestamp.

import findspark 
findspark.init() 
from pyspark import SparkContext,SparkConf 
from pyspark.sql import Row 
from pyspark.sql.functions import * 

sc=SparkContext.getOrCreate() 
#creating dataframe with three records 
df=sc.parallelize([Row(name='Gokul',Class=10,marks=480,grade='A')]).toDF() 
print("Printing df dataframe below ") 
df.show()
#Getting current date and timestamp
df.withColumn("currentdt",current_date()).withColumn("timestamp",current_timestamp()).show()
Output
Printing df dataframe below 
+-----+-----+-----+-----+
|Class|grade|marks| name|
+-----+-----+-----+-----+
|   10|    A|  480|Gokul|
+-----+-----+-----+-----+
+-----+-----+-----+-----+----------+--------------------+
|Class|grade|marks| name| currentdt|           timestamp|
+-----+-----+-----+-----+----------+--------------------+
|   10|    A|  480|Gokul|2020-02-27|2020-02-27 21:45:...|
+-----+-----+-----+-----+----------+--------------------+
Reference

http://spark.apache.org/docs/latest/api/python/pyspark.sql.html?highlight=date

renaming dataframe column in pyspark


How to compare two strings in java

In this post, we will learn how to compare two strings in Java

It might sound easy, but we need to take care of a few things while comparing two Strings

Avoid ==

If you are a newbie to Java, this might surprise you

When you use == to compare Strings, it does not compare the values of the two Strings

Instead, it checks whether the two references point to the same object

"BeginnersBug" == new String("BeginnersBug") 
// Above check will return false 

In the above snippet, even though both Strings hold the same value, the check returns false because they are not the same object

.equals() method

.equals() compares two Strings by value, irrespective of the object

Even if they are two different objects, it compares the values of the two Strings

"BeginnersBug".equals(new String("BeginnersBug")) 
// Above check will return true

In the above code snippet, the two String values are compared and the condition returns true

.equalsIgnoreCase() method

The equalsIgnoreCase() method ignores case while checking the values of two Strings

If you want to compare two Strings irrespective of case, you can use this method

"beginnersbug".equalsIgnoreCase(new String("BeginnersBug")) 
// Above method will return true

In the above code snippet, equalsIgnoreCase() ignores the case, so the statement returns true
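
Null-safe comparison

One more caveat: calling .equals() on a null reference throws a NullPointerException. java.util.Objects.equals() is a null-safe alternative; below is a minimal sketch.

import java.util.Objects;

public class NullSafeCompareExample {
	public static void main(String[] args) {
		String a = null;
		// a.equals("BeginnersBug") would throw a NullPointerException here
		System.out.println(Objects.equals(a, "BeginnersBug")); // prints false
		System.out.println(Objects.equals("BeginnersBug", new String("BeginnersBug"))); // prints true
	}
}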

Example

public class CompareStringExample {
	public static void main(String[] args) {
		try {
			// Example 1: the below check with == will fail
			if ("BeginnersBug" == new String("BeginnersBug")) {
				System.out.println("Example 1 : Both the strings are equal");
			} else {
				System.out.println("Example 1 : Both the strings are not equal");
			}

			// Example 2 .equals()
			if ("BeginnersBug".equals(new String("BeginnersBug"))) {
				System.out.println("Example 2 : Both the strings are equal");
			} else {
				System.out.println("Example 2 : Both the strings are not equal");
			}

			// Example 3 .equalsIgnoreCase()
			if ("beginnersbug".equalsIgnoreCase(new String("BeginnersBug"))) {
				System.out.println("Example 3 : Both the strings are equal");
			} else {
				System.out.println("Example 3 : Both the strings are not equal");
			}

		} catch (Exception e) {
			e.printStackTrace();
		}
	}
}
Output
Example 1 : Both the strings are not equal
Example 2 : Both the strings are equal
Example 3 : Both the strings are equal
Reference

https://docs.oracle.com/javase/7/docs/api/java/lang/String.html#equalsIgnoreCase(java.lang.String)

Related Articles

Convert String to lowercase using java

Print a String in java with example


get key and value from hashmap in java

In this tutorial, we will learn how to get keys and values from a HashMap in Java

Example Using Java 8

import java.util.HashMap;
import java.util.Map;

public class KeyValueFromHashMap {

	public static void main(String[] args) {
		try {
			Map<String, String> hashMap = new HashMap<>();
			hashMap.put("1", "Car");
			hashMap.put("2", "Bus");
			hashMap.put("3", "Train");

			// Below code only works on Java 8 and above
			hashMap.forEach((key, value) -> {
				System.out.println("Value of " + key + " is " + value);
			});

		} catch (Exception e) {
			e.printStackTrace();
		}
	}
}
Output
Value of 1 is Car
Value of 2 is Bus
Value of 3 is Train
Example using java 7

import java.util.HashMap;
import java.util.Map;

public class KeyValueFromHashMap {

	public static void main(String[] args) {
		try {
			Map<String, String> hashMap = new HashMap<>();
			hashMap.put("1", "Car");
			hashMap.put("2", "Bus");
			hashMap.put("3", "Train");

			for (Map.Entry<String, String> entry : hashMap.entrySet()) {
				System.out.println(entry.getKey() + " Value is " + entry.getValue());
			}

		} catch (Exception e) {
			e.printStackTrace();
		}
	}
}
Output
1 Value is Car
2 Value is Bus
3 Value is Train
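Getting a single value

If you only need the value for one key, HashMap's get() method returns it directly, or null when the key is absent. A quick sketch below.

import java.util.HashMap;
import java.util.Map;

public class GetSingleValueExample {

	public static void main(String[] args) {
		Map<String, String> hashMap = new HashMap<>();
		hashMap.put("1", "Car");
		// get() returns the value mapped to the key, or null if the key is absent
		System.out.println(hashMap.get("1")); // prints Car
		System.out.println(hashMap.get("5")); // prints null
	}
}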
GitHub Url

https://github.com/rkumar9090/BeginnersBug/blob/master/BegineersBug/src/com/geeks/example/KeyValueFromHashMap.java

Related Articles

Check key exists in Hashmap Java


renaming dataframe column in pyspark

In this post, we will learn about renaming dataframe columns in pyspark.

Sample program

withColumn() is used for creating a new column in a dataframe.

Whereas withColumnRenamed() is used for renaming columns.

Note: the letters C and R in withColumn and withColumnRenamed need to be capitalized.

import findspark 
findspark.init() 
from pyspark import SparkContext,SparkConf 
from pyspark.sql import Row 
from pyspark.sql.functions import * 
sc=SparkContext.getOrCreate() 
#creating dataframe with three records
df=sc.parallelize([Row(name='Gokul',Class=10,marks=480,grade='A'),Row(name='Usha',Class=12,marks=450,grade='A'),Row(name='Rajesh',Class=12,marks=430,grade='B')]).toDF()
print("Printing df dataframe ")
df.show()
# Creating new column as Remarks
df1=df.withColumn("Remarks",lit('Good'))
print("Printing df1 dataframe")
df1.show()
#Renaming the column Remarks as Feedback
df2=df1.withColumnRenamed('Remarks','Feedback')
print("Printing df2 dataframe")
df2.show()
Output
Printing df dataframe 
+-----+-----+-----+------+
|Class|grade|marks|  name|
+-----+-----+-----+------+
|   10|    A|  480| Gokul|
|   12|    A|  450|  Usha|
|   12|    B|  430|Rajesh|
+-----+-----+-----+------+

Printing df1 dataframe
+-----+-----+-----+------+-------+
|Class|grade|marks|  name|Remarks|
+-----+-----+-----+------+-------+
|   10|    A|  480| Gokul|   Good|
|   12|    A|  450|  Usha|   Good|
|   12|    B|  430|Rajesh|   Good|
+-----+-----+-----+------+-------+

Printing df2 dataframe
+-----+-----+-----+------+--------+
|Class|grade|marks|  name|Feedback|
+-----+-----+-----+------+--------+
|   10|    A|  480| Gokul|    Good|
|   12|    A|  450|  Usha|    Good|
|   12|    B|  430|Rajesh|    Good|
+-----+-----+-----+------+--------+
printSchema()

The function printSchema() helps us to view the schema of a dataframe.

df2.printSchema()
root
 |-- Class: long (nullable = true)
 |-- grade: string (nullable = true)
 |-- marks: long (nullable = true)
 |-- name: string (nullable = true)
 |-- Feedback: string (nullable = false)
Reference

https://spark.apache.org/docs/2.1.0/api/python/pyspark.sql.html#pyspark.sql.DataFrame.withColumnRenamed

Creating dataframes in pyspark using parallelize