I have a function written in SQL (PL/pgSQL) and I would like to rewrite it in Spark. How should I proceed?

Thanks.

The function looks like this:

CREATE OR REPLACE FUNCTION grid(
    widthX      double precision,
    heightY     double precision,
    spX         double precision,
    spY         double precision,
    orientation double precision)
RETURNS void AS
$BODY$
DECLARE
    n integer;
    m integer;
    d double precision;
BEGIN
    n := trunc( heightY / spY ) + 1;
    m := trunc( widthX / spX ) + 1;
    d := sqrt(spX^2 + spY^2);
    beta := degrees(atan(spY / spX));

    rowGrid := 1;

    DELETE FROM WHERE ;
    DELETE FROM WHERE ;
    DELETE FROM WHERE ;

    WHILE n-1 > 0 LOOP
        columnGrid := 1;

        WHILE m-1 > 0 LOOP
            point1 :=
            point2 :=
            point3 :=
            point4 :=

            execute saveBin( );

            EXIT WHEN m-1 < 0;
        END LOOP;
    END LOOP;
END;
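Here is roughly what I imagine the Spark side could look like, as a minimal sketch in Scala. The parameter values, the column names, and the x/y expressions are placeholders of my own, since the point1..point4 formulas, the DELETE targets, and saveBin() are not shown above. The main idea would be to replace the two WHILE loops with a cross join of a row-index range and a column-index range, so that there is one DataFrame row per grid cell.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object GridSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("grid-sketch")
      .master("local[*]")
      .getOrCreate()

    // Example values standing in for the SQL function arguments.
    val widthX      = 100.0
    val heightY     = 50.0
    val spX         = 10.0
    val spY         = 5.0
    val orientation = 0.0   // not used in the excerpt shown above

    // Same derived values as in the PL/pgSQL body.
    val n    = (heightY / spY).toInt + 1          // number of rows
    val m    = (widthX / spX).toInt + 1           // number of columns
    val d    = math.sqrt(spX * spX + spY * spY)   // cell diagonal
    val beta = math.toDegrees(math.atan(spY / spX))

    // The excerpt does not show how the loop counters advance, so the
    // exact ranges here are a guess: indices 1..n-1 and 1..m-1.
    val rows = spark.range(1, n).toDF("rowGrid")
    val cols = spark.range(1, m).toDF("columnGrid")

    // One output row per grid cell instead of the nested WHILE loops.
    // The x/y expressions are placeholders for the point1..point4 logic.
    val cells = rows.crossJoin(cols)
      .withColumn("x", (col("columnGrid") - 1) * lit(spX))
      .withColumn("y", (col("rowGrid") - 1) * lit(spY))
      .withColumn("cellDiagonal", lit(d))
      .withColumn("beta", lit(beta))

    // Whatever saveBin() persisted per cell would become a write of this
    // DataFrame (to a table or files) rather than a per-row call.
    cells.show(5)

    spark.stop()
  }
}

Is replacing the loops with a cross join like this the right way to think about it, or is there a better pattern for this kind of grid generation in Spark?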
