@@ -24,7 +24,7 @@ a total order in the output.
### Syntax
{% highlight sql %}
- ORDER BY { expression [ sort_direction | nulls_sort_oder ] [ , ...] }
+ ORDER BY { expression [ sort_direction | nulls_sort_oder ] [ , ... ] }
{% endhighlight %}
### Parameters
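The grammar line touched in the hunk above accepts one or more sort keys, each with an optional direction and null ordering. A minimal illustration of that syntax (not part of the patch), assuming a hypothetical `employees` table with `name` and `age` columns:

{% highlight sql %}
-- Sort by age descending, then break ties by name in the default ascending order.
SELECT name, age
FROM employees
ORDER BY age DESC, name;
{% endhighlight %}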
@@ -38,7 +38,7 @@ ORDER BY { expression [ sort_direction | nulls_sort_oder ] [ , ...] }
<dd >
Optionally specifies whether to sort the rows in ascending or descending
order. The valid values for the sort direction are <code>ASC</code> for ascending
- and <code>DESC</code> for descending. If sort direction is not explicitly specified then by default
+ and <code>DESC</code> for descending. If sort direction is not explicitly specified, then by default
rows are sorted ascending. <br><br>
<b>Syntax:</b>
<code>
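The sentence reworded in this hunk states that omitting the sort direction falls back to ascending order. A quick sketch of that default (not part of the patch), again against the hypothetical `employees` table:

{% highlight sql %}
-- The two queries below are equivalent: ASC is the default sort direction.
SELECT name, age FROM employees ORDER BY age;
SELECT name, age FROM employees ORDER BY age ASC;
{% endhighlight %}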
@@ -50,7 +50,7 @@ ORDER BY { expression [ sort_direction | nulls_sort_oder ] [ , ...] }
Optionally specifies whether NULL values are returned before/after non-NULL values, based on the
sort direction. In Spark, NULL values are considered to be lower than any non-NULL values by default.
Therefore the ordering of NULL values depend on the sort direction. If <code>null_sort_order</code> is
- not specified then NULLs sort first if sort order is <code>ASC</code> and NULLS sort last if
+ not specified, then NULLs sort first if sort order is <code>ASC</code> and NULLS sort last if
sort order is <code>DESC</code>.<br><br>
<ol>
<li> If <code>NULLS FIRST</code> (the default) is specified, then NULL values are returned first
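The default null placement described in this hunk (NULLs sort first under ASC, last under DESC) can be spelled out or overridden with an explicit null ordering. A hedged sketch, assuming the same hypothetical `employees` table where `age` may contain NULLs:

{% highlight sql %}
-- Make the documented defaults explicit.
SELECT name, age FROM employees ORDER BY age ASC NULLS FIRST;  -- same as ORDER BY age
SELECT name, age FROM employees ORDER BY age DESC NULLS LAST;  -- same as ORDER BY age DESC

-- Override the default placement so NULL ages appear last in an ascending sort.
SELECT name, age FROM employees ORDER BY age ASC NULLS LAST;
{% endhighlight %}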