PySpark is not working: Exception in thread "main" java.lang.ClassCastException

I have tried various fixes, but this is the only output I get. Please help me rectify it.

(pyspark_env) teja@teja-Vostro-15-3568:~$ pyspark
Python 3.7.4 (default, Aug 13 2019, 20:35:49) 
[GCC 7.3.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.
2019-09-30 01:56:05,725 WARN util.Utils: Your hostname, teja-Vostro-15-3568 resolves to a loopback address: 127.0.1.1; using 192.168.0.101 instead (on interface wlp1s0)
2019-09-30 01:56:05,726 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Exception in thread "main" java.lang.ClassCastException: org.apache.xerces.dom.DeferredElementNSImpl cannot be cast to org.w3c.dom.Text
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2604)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2492)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2405)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:1143)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:1115)
    at org.apache.spark.deploy.SparkHadoopUtil$.org$apache$spark$deploy$SparkHadoopUtil$$appendS3AndSparkHadoopConfigurations(SparkHadoopUtil.scala:464)
    at org.apache.spark.deploy.SparkHadoopUtil$.newConfiguration(SparkHadoopUtil.scala:436)
    at org.apache.spark.deploy.SparkSubmit$$anonfun$2.apply(SparkSubmit.scala:323)
    at org.apache.spark.deploy.SparkSubmit$$anonfun$2.apply(SparkSubmit.scala:323)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:323)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:774)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Traceback (most recent call last):
  File "/home/teja/spark-2.4.4-bin-hadoop2.7/python/pyspark/shell.py", line 38, in <module>
    SparkContext._ensure_initialized()
  File "/home/teja/spark-2.4.4-bin-hadoop2.7/python/pyspark/context.py", line 316, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
  File "/home/teja/spark-2.4.4-bin-hadoop2.7/python/pyspark/java_gateway.py", line 46, in launch_gateway
    return _launch_gateway(conf)
  File "/home/teja/spark-2.4.4-bin-hadoop2.7/python/pyspark/java_gateway.py", line 108, in _launch_gateway
    raise Exception("Java gateway process exited before sending its port number")
Exception: Java gateway process exited before sending its port number
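
From the stack trace, the crash happens while Hadoop's Configuration.loadResource is parsing an XML config file, before Spark itself gets going; the Python-side "Java gateway process exited" message is just the consequence. A cast from DeferredElementNSImpl to org.w3c.dom.Text failing usually means an element node turned up where only text is expected, i.e. a stray nested tag inside a <name> or <value> element in one of the files under $HADOOP_CONF_DIR. A rough check I would try (a sketch, assuming xmllint is available; on Ubuntu it comes from the libxml2-utils package):

for f in "$HADOOP_CONF_DIR"/*.xml; do
    echo "== $f"
    # catch plain XML syntax errors first
    xmllint --noout "$f" || continue
    # print any element nested inside <name> or <value> -- the pattern
    # that can make Hadoop's parser throw this exact ClassCastException
    xmllint --xpath '//property/*[self::name or self::value]/*' "$f" 2>/dev/null \
        && echo "   ^ nested element where Hadoop expects plain text"
done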

My .bashrc:
export HADOOP_HOME=$HOME/hadoop-3.2.1
export HADOOP_CONF_DIR=$HOME/hadoop-3.2.1/etc/hadoop
export HADOOP_MAPRED_HOME=$HOME/hadoop-3.2.1
export HADOOP_COMMON_HOME=$HOME/hadoop-3.2.1
export HADOOP_HDFS_HOME=$HOME/hadoop-3.2.1
export YARN_HOME=$HOME/hadoop-3.2.1
export PATH=$PATH:$HOME/hadoop-3.2.1/bin

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin:$PATH

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export SPARK_HOME=~/spark-2.4.4-bin-hadoop2.7
export PATH=$SPARK_HOME/bin:$PATH
export PYSPARK_PYTHON=python3.7
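
To confirm that the Hadoop config files (and not Spark itself) are the culprit, one test is to launch PySpark once with the Hadoop environment variables stripped; spark-2.4.4-bin-hadoop2.7 bundles its own Hadoop 2.7 jars, so it can start without them:

# if the shell starts cleanly like this, the problem is in one of the
# XML files under $HADOOP_CONF_DIR
env -u HADOOP_CONF_DIR -u HADOOP_HOME pyspark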

I also have Hadoop 3.2.1 installed alongside Spark.
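
Note that spark-2.4.4-bin-hadoop2.7 ships its own Hadoop 2.7 client jars while HADOOP_CONF_DIR points it at Hadoop 3.2.1's configuration. If the XML files turn out to be fine, an alternative worth considering is Spark's "Hadoop free" build (spark-2.4.4-bin-without-hadoop), which is documented to pick up an existing Hadoop installation via SPARK_DIST_CLASSPATH:

# only relevant if switching to the spark-2.4.4-bin-without-hadoop build:
# let Spark use the installed Hadoop 3.2.1 jars instead of bundling its own
export SPARK_DIST_CLASSPATH=$(hadoop classpath)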