# The Official Java Client for Thordata APIs
Native implementation for maximum compatibility and performance.
The Thordata Java SDK provides a robust integration with Thordata's infrastructure. It features a custom socket-level implementation for proxy tunneling, ensuring 100% compatibility with Thordata's secure gateway authentication (TLS-in-TLS), where standard HTTP libraries often fail.
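In a TLS-in-TLS setup, the client first performs a TLS handshake with the proxy itself, sends an authenticated `CONNECT`, and then performs a second TLS handshake with the origin server through that tunnel. For background, here is a minimal sketch of the general technique using plain JSSE sockets. It illustrates the idea only, not the SDK's internal code; the proxy host, port, and credentials are placeholders:

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import javax.net.ssl.SSLSocket;
import javax.net.ssl.SSLSocketFactory;

public class TlsInTlsSketch {
    public static void main(String[] args) throws Exception {
        String proxyHost = "proxy.example.com"; // placeholder
        int proxyPort = 443;                    // placeholder
        String targetHost = "httpbin.org";
        String credentials = Base64.getEncoder()
                .encodeToString("user:pass".getBytes(StandardCharsets.UTF_8)); // placeholder

        SSLSocketFactory factory = (SSLSocketFactory) SSLSocketFactory.getDefault();

        // 1. Outer TLS handshake with the proxy itself.
        SSLSocket proxySocket = (SSLSocket) factory.createSocket(proxyHost, proxyPort);
        proxySocket.startHandshake();

        // 2. Preemptive authentication: send Proxy-Authorization on the first
        //    CONNECT instead of waiting for a 407 challenge.
        OutputStream out = proxySocket.getOutputStream();
        out.write(("CONNECT " + targetHost + ":443 HTTP/1.1\r\n"
                + "Host: " + targetHost + ":443\r\n"
                + "Proxy-Authorization: Basic " + credentials + "\r\n\r\n")
                .getBytes(StandardCharsets.US_ASCII));
        out.flush();

        // Read the proxy's reply up to the blank line; expect a 200.
        InputStream in = proxySocket.getInputStream();
        StringBuilder reply = new StringBuilder();
        int c;
        while ((c = in.read()) != -1) {
            reply.append((char) c);
            if (reply.indexOf("\r\n\r\n") >= 0) break;
        }
        if (!reply.toString().startsWith("HTTP/1.1 200")) {
            throw new IllegalStateException("CONNECT failed: " + reply);
        }

        // 3. Inner TLS handshake with the origin, layered over the outer stream.
        SSLSocket originSocket = (SSLSocket)
                factory.createSocket(proxySocket, targetHost, 443, true);
        originSocket.startHandshake();
        System.out.println("Tunnel established: " + originSocket.getSession());
        originSocket.close();
    }
}
```

Sending `Proxy-Authorization` on the first `CONNECT` (step 2) is what the feature list below calls preemptive authentication.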
## Key Features
- 🛡️ Rock-Solid Proxying: Custom socket implementation supports Preemptive Authentication and SSL Tunneling perfectly.
- ⚡ Connection Pooling: Internal `HttpClient` cache for high-throughput scenarios (see the sketch below).
- ☕ Pure Java: Minimal dependencies, leveraging `java.net.http` (Java 11+).
- 🧩 Lazy Validation: Flexible initialization for different use cases.
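The pooling behavior can be pictured as a small cache of configured `HttpClient` instances keyed by proxy endpoint, so repeated requests through the same gateway reuse connections. This is a hypothetical sketch of the pattern, not the SDK's actual internals; the `clientFor` name is invented for illustration:

```java
import java.net.InetSocketAddress;
import java.net.ProxySelector;
import java.net.http.HttpClient;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical illustration of an HttpClient cache keyed by proxy endpoint.
public class ClientCacheSketch {
    private final ConcurrentHashMap<String, HttpClient> cache = new ConcurrentHashMap<>();

    public HttpClient clientFor(String proxyHost, int proxyPort) {
        // computeIfAbsent builds the client once per endpoint and reuses it,
        // so the underlying connection pool is shared across requests.
        return cache.computeIfAbsent(proxyHost + ":" + proxyPort, key ->
                HttpClient.newBuilder()
                        .proxy(ProxySelector.of(new InetSocketAddress(proxyHost, proxyPort)))
                        .build());
    }
}
```

`computeIfAbsent` keeps lookups of existing entries cheap and thread-safe, which is why this pattern suits high-throughput use.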
## Installation

Add this to your `pom.xml`:
```xml
<dependency>
  <groupId>com.thordata</groupId>
  <artifactId>thordata-java-sdk</artifactId>
  <version>1.2.0</version>
</dependency>
```

## Quick Start

```java
import com.thordata.sdk.*;
// Read the API token from an environment variable
ThordataConfig cfg = new ThordataConfig(
System.getenv("THORDATA_SCRAPER_TOKEN"),
null, null
);
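// Note: the two nulls are assumed to be the remaining credential slots, left
// unset here. Per the "Lazy Validation" feature, construction is presumed to
// succeed with missing tokens; errors would surface only when the matching
// API is actually called.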
ThordataClient client = new ThordataClient(cfg);
```

## Proxy Requests

```java
// Create Proxy Config (Residential, US, Sticky)
ProxyConfig proxy = ProxyConfig.residentialFromEnv()
.country("us")
.city("new_york")
.sticky(10); // 10 min session
// This uses the custom socket implementation for max compatibility
ProxyResponse resp = client.proxyGet("https://httpbin.org/ip", proxy);
System.out.println("Status: " + resp.statusCode);
System.out.println("Body: " + resp.bodyText());SerpOptions opt = new SerpOptions();
opt.query = "Java threading";
opt.engine = "google";
opt.num = 10;
// Returns strongly-typed response object
SerpResponse result = client.serpSearch(opt);
System.out.println("Result count: " + result.organicResults.size());UniversalOptions opt = new UniversalOptions();
opt.url = "https://example.com/protected";
opt.jsRender = true;
opt.waitFor = ".content-loaded";
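// Assumption: waitFor takes a CSS selector that the headless renderer waits
// for before returning HTML; jsRender enables that browser-based rendering.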
Object result = client.universalScrape(opt);
```

## Web Scraper Tasks

```java
// 1. Create Task
ScraperTaskOptions taskOpt = new ScraperTaskOptions();
taskOpt.fileName = "job_01";
taskOpt.spiderId = "universal";
taskOpt.spiderName = "universal";
taskOpt.parameters.put("url", "https://example.com");
String taskId = client.createScraperTask(taskOpt);
// 2. Poll Status & Get Result
while (true) {
    String status = client.getTaskStatus(taskId);
    if ("ready".equalsIgnoreCase(status)) {
        // The result is delivered as a download URL in the requested format.
        String url = client.getTaskResult(taskId, "json");
        System.out.println("Data URL: " + url);
        break;
    }
    // Poll every 5 seconds. Thread.sleep throws InterruptedException, so the
    // enclosing method must declare or handle it; production code should also
    // bound the number of attempts and handle failure statuses.
    Thread.sleep(5000);
}
}
```

## License

MIT License.