Posts (0)
Questions (3)
2024-03-11 11:00:05
Check out this solution (a fuller runnable sketch follows after the tags):

import pyspark.sql.functions as f

df = spark.createDataFrame([
    ('"aci*credit one bank, n"', ' '),
    ('odot dmv2u 503-9455400 or 06/30', ' '),
    ('# 7-eleven 4106...
Tags: arrays string pyspark
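The answer above is cut off, so here is a minimal, self-contained sketch of the same setup. Only the sample strings and the pyspark.sql.functions import come from the snippet; the single 'description' column and the regexp_replace cleanup are assumptions about where the truncated answer was heading.

from pyspark.sql import SparkSession
import pyspark.sql.functions as f

spark = SparkSession.builder.getOrCreate()

# Sample strings taken from the truncated snippet; the third one is
# cut off in the original, so its ending here is a guess.
df = spark.createDataFrame(
    [
        ('"aci*credit one bank, n"',),
        ('odot dmv2u 503-9455400 or 06/30',),
        ('# 7-eleven 4106',),
    ],
    ['description'],
)

# Keep only lowercase letters and spaces, then trim -- a plausible
# cleanup for merchant strings like these, not necessarily the one
# the original answer performed.
df.withColumn(
    'cleaned',
    f.trim(f.regexp_replace('description', r'[^a-z ]', '')),
).show(truncate=False)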
2024-03-11 11:30:04
You can use the map_filter function, like below.

df.withColumn(
    "filtered_map",
    expr("size(map_filter(Maptype_col, (k, v) -> v > 1))")
)

+-------------+------------------------+------------+
|...
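Since the snippet shows only the transformation and a cut-off output table, here is a self-contained version of the same map_filter call. The column name Maptype_col comes from the answer; the sample row and the 'id' column are assumptions. Note that map_filter requires Spark 3.0 or later.

from pyspark.sql import SparkSession
from pyspark.sql.functions import expr

spark = SparkSession.builder.getOrCreate()

# A one-row map column; the keys and values are illustrative only.
df = spark.createDataFrame(
    [('a', {'x': 1, 'y': 2, 'z': 3})],
    ['id', 'Maptype_col'],
)

# map_filter drops entries whose value is <= 1; size() counts what
# survives, so this row yields 2 ('y' and 'z' remain).
df.withColumn(
    'filtered_map',
    expr('size(map_filter(Maptype_col, (k, v) -> v > 1))'),
).show(truncate=False)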
2024-03-12 20:30:09
Just add a sum window function and you will get the expected result (see the sketch after the tags).

w = Window.orderBy("week")
df2 = df.withColumn('value1', f.when((f.col('week') > 1),...
Tags: pyspark
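This answer is also truncated mid-expression, and the f.when branch cannot be recovered. Below is a minimal sketch of the running-sum idea it describes, with hypothetical 'week' and 'value' columns and sample data standing in for the original question's DataFrame.

from pyspark.sql import SparkSession, Window
import pyspark.sql.functions as f

spark = SparkSession.builder.getOrCreate()

# Hypothetical weekly values; the real question's data is not shown.
df = spark.createDataFrame(
    [(1, 10), (2, 20), (3, 30)],
    ['week', 'value'],
)

# A cumulative sum over rows ordered by week; with an ordered window,
# the default frame runs from the first row up to the current row.
w = Window.orderBy('week')
df2 = df.withColumn('value1', f.sum('value').over(w))
df2.show()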
