
Comparing changes

Choose two branches to see what's changed or to start a new pull request. If you need to, you can also compare across forks or learn more about diff comparisons.

base repository: databricks/spark-redshift
base: master
head repository: databricks/spark-redshift
compare: branch-1.x
  • 11 commits
  • 15 files changed
  • 7 contributors

Commits on Aug 4, 2016

  1. Custom JDBC column types back-port

    Back-port JDBC column types to 1.x branch.
    
    Author: Marc-Andre Tremblay <marcandre.tr@gmail.com>
    
    Closes #247 from nrstott/feature/custom-jdbc-column-types.
    marctrem authored and JoshRosen committed Aug 4, 2016
    SHA: 5ed63ad
  2. Redshift doesn't support BLOB

    Author: Mathias Bogaert <mathias.bogaert@gmail.com>
    
    Closes #251 from analytically/patch-1.
    analytically authored and JoshRosen committed Aug 4, 2016
    SHA: 237f947

Commits on Aug 20, 2016

  1. Add support for JDBC 4.2

    This patch updates `RedshiftJDBCWrapper.getDriverClass` to automatically recognize the Redshift JDBC 4.2 driver.
    
    Fixes #258.
    
    Author: Travis Crawford <traviscrawford@gmail.com>
    
    Closes #259 from traviscrawford/travis/jdbc42.
    traviscrawford authored and JoshRosen committed Aug 20, 2016
    SHA: ebe5e1a
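As a rough illustration of the driver lookup this patch extends (the helper and its classloader-probe parameter are hypothetical; only the driver class names are real Redshift driver classes):

```scala
// Candidate Redshift driver classes, newest JDBC spec first. `tryLoad`
// stands in for a classloader probe (e.g. a Class.forName attempt that
// reports success or failure); the first loadable driver wins.
def resolveDriverClass(tryLoad: String => Boolean): Option[String] = {
  val candidates = Seq(
    "com.amazon.redshift.jdbc42.Driver", // JDBC 4.2, recognized by this patch
    "com.amazon.redshift.jdbc41.Driver", // JDBC 4.1
    "com.amazon.redshift.jdbc4.Driver"   // JDBC 4.0
  )
  candidates.find(tryLoad)
}
```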
  2. Handle invalid S3 hostname exceptions with older aws-java-sdk versions

    We've seen a lot of messages lately regarding the "Invalid S3 URI: hostname does not appear to be a valid S3 endpoint" exception, so we thought we should contribute our two cents and the code changes that worked for us. We tried many of the approaches listed in that thread, including using the `spark.executor.extraClassPath` and `spark.driver.extraClassPath` properties to prepend a newer SDK to the classpath, and bundling it in the assembled jar or as a shaded jar. Unfortunately, many of these approaches failed, mainly because the older aws-java-sdk jar is present on the machines themselves and usually takes precedence. We ended up going with what Josh mentioned earlier: changing the S3 URL in the spark-redshift code to add the endpoint to the host (`*.s3.amazonaws.com`).
    
    This logic will try to instantiate a new AmazonS3URI and if it fails, it'll try to add the default S3 Amazon domain to the host.
    
    Author: James Hou <jameshou@data101.udemy.com>
    Author: James Hou <james.hou@gmail.com>
    Author: Josh Rosen <joshrosen@databricks.com>
    
    Closes #254 from jameshou/feature/add-s3-full-endpoint-v1.
    James Hou authored and JoshRosen committed Aug 20, 2016
    SHA: 8782802
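The fallback described above can be sketched in plain Scala, using `java.net.URI` as a stand-in for the AWS SDK's `AmazonS3URI` (the helper name is illustrative):

```scala
import java.net.URI

// If a bucket-style URI such as s3n://my-bucket/key is rejected by older
// aws-java-sdk versions, rewrite the host into a full S3 endpoint by
// appending the default Amazon S3 domain.
def addEndpointToHost(uri: URI): URI = {
  new URI(uri.getScheme, uri.getUserInfo, uri.getHost + ".s3.amazonaws.com",
    uri.getPort, uri.getPath, uri.getQuery, uri.getFragment)
}
```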

Commits on Aug 21, 2016

  1. SHA: 34d03e9
  2. Setting version to 1.1.0

    JoshRosen committed Aug 21, 2016
    SHA: 33fa626
  3. SHA: 6e4ecd4

Commits on Sep 8, 2016

  1. SHA: ad2498d

Commits on Sep 21, 2016

  1. Fix reading of NaN and Infinity

    This patch fixes a bug which caused `spark-redshift` to throw `NumberFormatException` when reading NaN or Infinity from Redshift.
    
    This patch fixes the bug by adding special-case handling of the string constants `nan`, `inf`, and `-inf`, which are the values sent back by Redshift during unloads. Note that we still do not support loads of `NaN` to Redshift since Redshift itself does not seem to support this yet (https://forums.aws.amazon.com/thread.jspa?threadID=236367).
    
    Fixes #261.
    
    Author: Josh Rosen <joshrosen@databricks.com>
    
    Closes #269 from JoshRosen/fix-nan.
    JoshRosen committed Sep 21, 2016
    SHA: 0abd27d
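A minimal sketch of the special-casing described above (the helper name is illustrative; the string constants are the ones the commit message cites):

```scala
// Map Redshift's unload spellings of non-finite floats before falling
// back to ordinary numeric parsing, avoiding NumberFormatException.
def parseRedshiftDouble(s: String): Double = s match {
  case "nan"  => Double.NaN
  case "inf"  => Double.PositiveInfinity
  case "-inf" => Double.NegativeInfinity
  case other  => other.toDouble
}
```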

Commits on Oct 18, 2016

  1. Updated README.md with sample code to read a Redshift table in SparkR

    Added sample code for reading a Redshift table with SparkR.
    
    Author: Ganesh Chand <ganeshchand@gmail.com>
    Author: Josh Rosen <joshrosen@databricks.com>
    
    Closes #282 from ganeshchand/patch-1.
    ganeshchand authored and JoshRosen committed Oct 18, 2016
    SHA: 9f12f3f

Commits on Nov 16, 2016

  1. Wrap and re-throw Await.result exceptions in order to capture full stacktrace
    
    Exceptions thrown from Scala's `Await.result` don't include the waiting thread's stacktrace, making it hard to figure out where errors occur. Similar to the fix implemented in Spark in apache/spark#12433, this patch modifies our `Await.result` usages to wrap and rethrow exceptions to capture the calling thread's stack.
    
    Author: Josh Rosen <joshrosen@databricks.com>
    
    Closes #299 from JoshRosen/better-error-reporting.
    
    (cherry picked from commit b4c6053)
    Signed-off-by: Josh Rosen <joshrosen@databricks.com>
    JoshRosen committed Nov 16, 2016
    SHA: 647f678
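The wrap-and-rethrow pattern can be sketched as follows (mirroring the Spark fix it cites; the method name and wrapper exception here are illustrative, and the real patch may differ in details):

```scala
import scala.concurrent.{Await, Awaitable}
import scala.concurrent.duration.Duration
import scala.util.control.NonFatal

// Exceptions from Await.result carry only the task thread's stacktrace;
// wrapping them in a fresh exception captures the waiting (calling)
// thread's stack as well, making the failure site visible.
def awaitResult[T](awaitable: Awaitable[T], atMost: Duration): T = {
  try {
    Await.result(awaitable, atMost)
  } catch {
    case NonFatal(t) =>
      throw new RuntimeException("Exception thrown in awaitResult", t)
  }
}
```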