|
| JobRequestHandler (RAFTConsensus *raft, RequestHandler *requestHandler, DatabaseConnection *database, Statistics *stats, std::string ip, int port) |
| Constructor method.
|
|
std::string | handleConnectRequest (boost::shared_ptr< TcpConnection > connection, std::string request) |
| Handles request from new node to connect to the network.
|
|
std::string | handleGetIPsRequest (std::string request, std::string client, std::string data) |
| Handles request for the IP addresses in the network.
|
|
std::string | handleUploadJobRequest (std::string request, std::string client, std::string data) |
| Handles request to upload one or more jobs with their priorities.
|
|
std::string | handleUploadJobRequest (std::string request, std::string client, std::vector< std::string > data) |
| Handles request to upload one or more jobs with their priorities.
|
|
std::string | handleGetJobRequest (std::string request, std::string client, std::string data) |
| Handles request to give the top job from the queue.
|
|
std::string | handleUpdateJobRequest (std::string request, std::string client, std::string data) |
| Handles the request to update the job time.
|
|
std::string | handleFinishJobRequest (std::string request, std::string client, std::string data) |
| Handles request to indicate the worker is finished with a job, successfully or not.
|
|
std::string | handleCrawlDataRequest (std::string request, std::string client, std::string data) |
| Handles request to upload crawl data to the job queue.
|
|
void | updateCrawlID (int id) |
| Updates the crawlID when crawl data is uploaded to the database.
|
|
DatabaseConnection * | getDatabaseConnection () |
|
|
int | crawlID |
| Variables describing the current crawlID, which the crawler needs to crawl a specific part of GitHub, and whether there is currently a crawler working.
|
|
long long | timeLastCrawl = -1 |
|
◆ handleCrawlDataRequest()
std::string JobRequestHandler::handleCrawlDataRequest (std::string request, std::string client, std::string data)
Handles request to upload crawl data to the job queue.
- Parameters
-
data | Data is almost the same as in handleUploadJobRequest, but now with an ID in front. Data format is: "id'\n'url1?priority1'\n'url2?priority2'\n'..." |
- Returns
- Returns the result of handleUploadJobRequest.
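
As an illustration of that data format, here is a minimal sketch of how a client could assemble the payload. buildCrawlData is a hypothetical helper; only the delimiters, the FIELD_DELIMITER_CHAR ('?') and the ENTRY_DELIMITER_CHAR ('\n') described under handleUploadJobRequest below, are taken from the API.

#include <sstream>
#include <string>
#include <utility>
#include <vector>

// Hypothetical helper: builds the handleCrawlDataRequest payload
// "id'\n'url1?priority1'\n'url2?priority2'\n'..." from a crawl ID
// and a list of (url, priority) pairs.
std::string buildCrawlData(int id, const std::vector<std::pair<std::string, int>> &jobs)
{
    std::ostringstream data;
    data << id; // the ID goes in front, before the job entries
    for (const auto &job : jobs)
    {
        data << '\n' << job.first << '?' << job.second;
    }
    return data.str();
}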
◆ handleFinishJobRequest()
std::string JobRequestHandler::handleFinishJobRequest (std::string request, std::string client, std::string data)
Handles request to indicate the worker is finished with a job, successfully or not.
- Returns
- Response is "Job not currently expected." if a newer version of the job was given out, or if the job is not known to have been given out. Reponse is "Job finished successfully" on success and "Job failed successfully" on a client-side failure, which was correctly handled and uploaded. For a client-side failure, when the database fails to upload the current job to the failedjobs table, "Job could not be added to failed jobs list." is returned.
◆ handleGetJobRequest()
std::string JobRequestHandler::handleGetJobRequest (std::string request, std::string client, std::string data)
Handles request to give the top job from the queue.
- Returns
- Response is "Spider?url", where url is the url of the first job in the database if there are enough in the database or if a crawler is already working. Response is "Crawl?crawlID", where crawlID is the current crawlID if the number of jobs is not enough and there is no crawler working. Response is "NoJob" if there are no jobs and a crawler is already working.
◆ handleUpdateJobRequest()
std::string JobRequestHandler::handleUpdateJobRequest (std::string request, std::string client, std::string data)
Handles the request to update the job time.
- Returns
- The new time of the job.
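
Since the handler returns the new time as a string, a caller would typically convert it to a numeric value. A minimal sketch, assuming the response is a plain textual number (matching the long long type of timeLastCrawl above); error responses would need separate handling.

#include <string>

// Illustrative conversion of the handleUpdateJobRequest response text
// into a numeric job time.
long long parseJobTime(const std::string &response)
{
    return std::stoll(response);
}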
◆ handleUploadJobRequest() [1/2]
std::string JobRequestHandler::handleUploadJobRequest (std::string request, std::string client, std::string data)
Handles request to upload one or more jobs with their priorities.
- Parameters
-
data | Consists of url and priority pairs; the url and priority are separated by the FIELD_DELIMITER_CHAR ('?') and the pairs by the ENTRY_DELIMITER_CHAR ('\n'). Data format is "url1?priority1'\n'url2?priority2'\n'..." |
- Returns
- Response to the user indicating whether the job(s) have been uploaded successfully.
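
A minimal sketch of building that data string; buildUploadData is a hypothetical helper, and only the two delimiters named above are taken from the API.

#include <sstream>
#include <string>
#include <utility>
#include <vector>

// Hypothetical helper: assembles the handleUploadJobRequest payload
// "url1?priority1'\n'url2?priority2'\n'..." from (url, priority) pairs,
// using '?' as FIELD_DELIMITER_CHAR and '\n' as ENTRY_DELIMITER_CHAR.
std::string buildUploadData(const std::vector<std::pair<std::string, int>> &jobs)
{
    std::ostringstream data;
    for (std::size_t i = 0; i < jobs.size(); ++i)
    {
        if (i > 0)
            data << '\n';
        data << jobs[i].first << '?' << jobs[i].second;
    }
    return data.str();
}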
◆ handleUploadJobRequest() [2/2]
std::string JobRequestHandler::handleUploadJobRequest (std::string request, std::string client, std::vector< std::string > data)
Handles request to upload one or more jobs with their priorities.
- Parameters
-
data | Consists of url and priority pairs; the url and priority are separated by the FIELD_DELIMITER_CHAR ('?') and the pairs by the ENTRY_DELIMITER_CHAR ('\n'). Data format is "url1?priority1'\n'url2?priority2'\n'..." |
- Returns
- Response to the user indicating whether the job(s) have been uploaded successfully.
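
For this overload the data arrives as a vector rather than one delimited string. Assuming each element holds one url?priority pair (an assumption, since the split is not documented here), the input would look like the following; the URLs are placeholders.

#include <string>
#include <vector>

// Illustrative input for the vector overload, assuming one
// "url?priority" entry per element.
std::vector<std::string> data = {
    "https://github.com/example/repo1?1",
    "https://github.com/example/repo2?2",
};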
◆ updateCrawlID()
void JobRequestHandler::updateCrawlID (int id)
Updates the crawlID when crawl data is uploaded to the database.
- Parameters
-
id | The new value of the crawlID. |
◆ crawlID
int JobRequestHandler::crawlID |
Variables describing the current crawlID, which the crawler needs to crawl a specific part of GitHub, and whether there is currently a crawler working.
The documentation for this class was generated from the following files:
- SearchSECODatabaseAPI/JobDistribution/JobRequestHandler.h
- SearchSECODatabaseAPI/JobDistribution/JobRequestHandler.cpp