Story/cite-177 There needs to be an import from Crossref #22
Open: PradnyaC11 wants to merge 41 commits into develop from story/CITE-177
Commits (41 total; the diff below reflects changes from 34 of them)
0106646  jdamerow    [CITE-177] started on processor to import from crossref
656fab7  PratikGiri  [CITE-177] Trying to add functionality for starting the import
a82d996  PratikGiri  [CITE-177] Trying to add functionality for picking the job.
d3aa18f  PratikGiri  [CITE-177] Adding function for starting the import
1710f06  PratikGiri  [CITE-177] Trying to add crossref import functionality
10d8372  PratikGiri  [CITE-177] Trying to create Iterator for Crossref
27ee091  PratikGiri  [CITE-177] Iterator changes.
f0c4b61  PratikGiri  [CITE-177] Updating the Iterator and ImportProcessor
58f195c  PratikGiri  [CITE-177] Adding iterator
a2d4979  PratikGiri  Adding iterator.
9616319  PratikGiri  [CITE-177] Correcting the iterator
e3856ea  PratikGiri  [CITE-177] Updating the Crossref Iterator
8c4545b  PratikGiri  [CITE-177] CrossrefIterator and identifier
c2bfbd0  PratikGiri  [CITE-177] Resolved error CrossrefReferenceImportProcessor class
74bcd5f  PradnyaC11  [CITE-177] Added CrossRef types to CrossRefPublication, and updated C…
7ecdfd0  PradnyaC11  [CITE-177] Updated CrossRefIterator
41fed60  PradnyaC11  [CITE-177] Updated CrossRefIterator
7f097df  PradnyaC11  [CITE-177] updated generateJson method of JsonGenerationService
46c2904  PradnyaC11  [CITE-177] Updated itemTypeMapping in crossref import processer
e36ff3d  PradnyaC11  [CITE-177] Added more mapping in CrossRefInportProcessor
1c94f97  PradnyaC11  [CITE-177] Udpated CrossRefIterator for typeMap and iterator logic
bed4968  PradnyaC11  [CITE-177] Added test cases for CrossrefReferenceImportProcessor
e48be13  PradnyaC11  [CITE-177] Refactoring code to fix issues.
c086f44  PradnyaC11  [CITE-177] Renamed file to remove unwanted file commit
287ffed  PradnyaC11  [CITE-177] Updated CrossRefIterator.java
5fd853e  PradnyaC11  [CITE-177] Addressed PR comments
c4c4f13  PradnyaC11  [CITE-177] Addressing PR comments
eb6a3a1  PradnyaC11  [CITE-177] Addressed PR comments
a06f8af  PradnyaC11  [CITE-177] Addressed PR comments
5767556  PradnyaC11  [CITE-177] Changed crossref-connect-version in pom.xml
bfcf53f  PradnyaC11  [CITE-177] Addressed PR comments
6670084  PradnyaC11  [CITE-177] Addressed PR comments
0023dd2  PradnyaC11  [CITE-177] Addressing PR comments
d954231  PradnyaC11  [CITE-177] Addressing PR comments
b7ded34  PradnyaC11  [CITE-177] Addressed PR comments
646c333  PradnyaC11  [CITE-177] Addressed PR comments
02bfdf3  PradnyaC11  [CITE-177] Resolved code factor issues
2575439  PradnyaC11  [CITE-177] Addressed PR comments
e8a1876  PradnyaC11  [CITE-177] Addressed PR comments
f32f4c0  PradnyaC11  [CITE-177] Updated dependency versions
3c80497  PradnyaC11  [CITE-177] Added javax.annotation dependency
Modified: ImportProcessor, refactored into the abstract base class AbstractImportProcessor:

```diff
@@ -9,7 +9,6 @@
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 import org.springframework.beans.factory.annotation.Autowired;
-import org.springframework.stereotype.Service;

 import com.fasterxml.jackson.core.JsonProcessingException;
 import com.fasterxml.jackson.databind.JsonNode;
@@ -18,7 +17,6 @@
 import com.fasterxml.jackson.databind.node.ObjectNode;

 import edu.asu.diging.citesphere.importer.core.exception.CitesphereCommunicationException;
-import edu.asu.diging.citesphere.importer.core.exception.IteratorCreationException;
 import edu.asu.diging.citesphere.importer.core.exception.MessageCreationException;
 import edu.asu.diging.citesphere.importer.core.kafka.impl.KafkaRequestProducer;
 import edu.asu.diging.citesphere.importer.core.model.BibEntry;
@@ -27,7 +25,6 @@
 import edu.asu.diging.citesphere.importer.core.service.ICitesphereConnector;
 import edu.asu.diging.citesphere.importer.core.service.IImportProcessor;
 import edu.asu.diging.citesphere.importer.core.service.parse.BibEntryIterator;
-import edu.asu.diging.citesphere.importer.core.service.parse.IHandlerRegistry;
 import edu.asu.diging.citesphere.importer.core.zotero.IZoteroConnector;
 import edu.asu.diging.citesphere.importer.core.zotero.template.IJsonGenerationService;
 import edu.asu.diging.citesphere.messages.KafkaTopics;
@@ -37,40 +34,28 @@
 import edu.asu.diging.citesphere.messages.model.ResponseCode;
 import edu.asu.diging.citesphere.messages.model.Status;

-/**
- * This class coordinates the import process. It connects with Citesphere and
- * downloads the files to be imported. It then starts the transformation process from
- * import format to internal bibliographical format and then turns the internal
- * bibliographical format to Json that can be submitted to Zotero.
- * @author jdamerow
- *
- */
-@Service
-public class ImportProcessor implements IImportProcessor {
+public abstract class AbstractImportProcessor implements IImportProcessor {

-    private final Logger logger = LoggerFactory.getLogger(getClass());
+    protected final Logger logger = LoggerFactory.getLogger(getClass());

     @Autowired
-    private ICitesphereConnector connector;
+    private KafkaRequestProducer requestProducer;

     @Autowired
-    private IHandlerRegistry handlerRegistry;
+    private ICitesphereConnector connector;

     @Autowired
     private IZoteroConnector zoteroConnector;

     @Autowired
     private IJsonGenerationService generationService;

-    @Autowired
-    private KafkaRequestProducer requestProducer;
-
     /**
      * Map that maps internal bibliographical formats (contants of {@link Publication}
      * class) to Zotero item types ({@link ItemType} enum).
      */
     private Map<String, ItemType> itemTypeMapping = new HashMap<>();

     @PostConstruct
     public void init() {
         // this needs to be changed and improved, but for now it works
@@ -81,45 +66,99 @@ public void init() {
         itemTypeMapping.put(Publication.NEWS_ITEM, ItemType.NEWSPAPER_ARTICLE);
         itemTypeMapping.put(Publication.PROCEEDINGS_PAPER, ItemType.CONFERENCE_PAPER);
         itemTypeMapping.put(Publication.DOCUMENT, ItemType.DOCUMENT);
         itemTypeMapping.put(Publication.BOOK, ItemType.BOOK);
         itemTypeMapping.put(Publication.REFERNCE_ENTRY, ItemType.DICTIONARY_ENTRY);
+        itemTypeMapping.put(Publication.POSTED_CONTENT, ItemType.WEBPAGE);
+        itemTypeMapping.put(Publication.COMPONENT, ItemType.ATTACHMENT);
+        itemTypeMapping.put(Publication.EDITED_BOOK, ItemType.BOOK);
+        itemTypeMapping.put(Publication.PROCEEDINGS_PAPER, ItemType.CONFERENCE_PAPER);
+        itemTypeMapping.put(Publication.DISSERTATION, ItemType.THESIS);
+        itemTypeMapping.put(Publication.BOOK_CHAPTER, ItemType.BOOK_SECTION);
+        itemTypeMapping.put(Publication.REPORT_COMPONENT, ItemType.REPORT);
+        itemTypeMapping.put(Publication.REPORT, ItemType.REPORT);
+        itemTypeMapping.put(Publication.PEER_REVIEW, ItemType.JOURNAL_ARTICLE);
+        itemTypeMapping.put(Publication.BOOK_TRACK, ItemType.BOOK);
+        itemTypeMapping.put(Publication.BOOK_PART, ItemType.BOOK_SECTION);
+        itemTypeMapping.put(Publication.OTHER, ItemType.DOCUMENT);
+        itemTypeMapping.put(Publication.BOOK_SET, ItemType.BOOK);
+        itemTypeMapping.put(Publication.PROCEEDINGS, ItemType.CONFERENCE_PAPER);
+        itemTypeMapping.put(Publication.DATABASE, ItemType.DATABASE);
+        itemTypeMapping.put(Publication.STANDARD, ItemType.STATUTE);
+        itemTypeMapping.put(Publication.REFERENCE_BOOK, ItemType.BOOK);
+        itemTypeMapping.put(Publication.GRANT, ItemType.DOCUMENT);
+        itemTypeMapping.put(Publication.DATASET, ItemType.DATABASE);
     }

-    /*
-     * (non-Javadoc)
-     *
-     * @see
-     * edu.asu.diging.citesphere.importer.core.service.impl.IImportProcessor#process
-     * (edu.asu.diging.citesphere.importer.core.kafka.impl.KafkaJobMessage)
-     */
     @Override
     public void process(KafkaJobMessage message) {
         JobInfo info = getJobInfo(message);
         if (info == null) {
             sendMessage(null, message.getId(), Status.FAILED, ResponseCode.X10);
             return;
         }
+        startImport(message, info);
+    }

-        String filePath = downloadFile(message);
-        if (filePath == null) {
-            sendMessage(null, message.getId(), Status.FAILED, ResponseCode.X20);
-            return;
-        }
+    private JobInfo getJobInfo(KafkaJobMessage message) {
+        JobInfo info = null;
+        try {
+            info = connector.getJobInfo(message.getId());
+        } catch (CitesphereCommunicationException e) {
+            logger.error("Could not get Zotero info.", e);
+            return null;
+        }
+        return info;
+    }

-        sendMessage(null, message.getId(), Status.PROCESSING, ResponseCode.P00);
-        BibEntryIterator bibIterator = null;
-        try {
-            bibIterator = handlerRegistry.handleFile(info, filePath);
-        } catch (IteratorCreationException e1) {
-            logger.error("Could not create iterator.", e1);
-        }
+    protected void sendMessage(ItemCreationResponse message, String jobId, Status status, ResponseCode code) {
+        KafkaImportReturnMessage returnMessage = new KafkaImportReturnMessage(message, jobId);
+        returnMessage.setStatus(status);
+        returnMessage.setCode(code);
+        try {
+            requestProducer.sendRequest(returnMessage, KafkaTopics.REFERENCES_IMPORT_DONE_TOPIC);
+        } catch (MessageCreationException e) {
+            logger.error("Exception sending message.", e);
+        }
+    }
+
+    protected ICitesphereConnector getCitesphereConnector() {
+        return connector;
+    }
+
+    private ItemCreationResponse submitEntries(ArrayNode entries, JobInfo info) {
+        ObjectMapper mapper = new ObjectMapper();
+        try {
+            String msg = mapper.writeValueAsString(entries);
+            logger.info("Submitting " + msg);
+            ItemCreationResponse response = zoteroConnector.addEntries(info, entries);
+            if (response != null) {
+                logger.info(response.getSuccessful() + "");
+                logger.error(response.getFailed() + "");
+            } else {
+                logger.error("Item creation failed.");
+            }
+            return response;
+        } catch (URISyntaxException e) {
+            logger.error("Could not store new entry.", e);
+        } catch (JsonProcessingException e) {
+            logger.error("Could not write JSON.");
+        }
+        return null;
+    }
+
+    private void startImport(KafkaJobMessage message, JobInfo info) {
+        sendMessage(null, message.getId(), Status.PROCESSING, ResponseCode.P00);
+
+        BibEntryIterator bibIterator = getBibEntryIterator(message, info);
+        if (bibIterator == null) {
+            sendMessage(null, message.getId(), Status.FAILED, ResponseCode.X30);
+            return;
+        }

         ObjectMapper mapper = new ObjectMapper();
         ArrayNode root = mapper.createArrayNode();
         int entryCounter = 0;

         while (bibIterator.hasNext()) {
             BibEntry entry = bibIterator.next();
             if (entry.getArticleType() == null) {
@@ -153,60 +192,5 @@ public void process(KafkaJobMessage message) {
         sendMessage(response, message.getId(), Status.DONE, ResponseCode.S00);
     }

-    private void sendMessage(ItemCreationResponse message, String jobId, Status status, ResponseCode code) {
-        KafkaImportReturnMessage returnMessage = new KafkaImportReturnMessage(message, jobId);
-        returnMessage.setStatus(status);
-        returnMessage.setCode(code);
-        try {
-            requestProducer.sendRequest(returnMessage, KafkaTopics.REFERENCES_IMPORT_DONE_TOPIC);
-        } catch (MessageCreationException e) {
-            // FIXME handle this case
-            logger.error("Exception sending message.", e);
-        }
-    }
-
-    private ItemCreationResponse submitEntries(ArrayNode entries, JobInfo info) {
-        ObjectMapper mapper = new ObjectMapper();
-        try {
-            String msg = mapper.writeValueAsString(entries);
-            logger.info("Submitting " + msg);
-            ItemCreationResponse response = zoteroConnector.addEntries(info, entries);
-            if (response != null) {
-                logger.info(response.getSuccessful() + "");
-                logger.error(response.getFailed() + "");
-            } else {
-                logger.error("Item creation failed.");
-            }
-            return response;
-        } catch (URISyntaxException e) {
-            logger.error("Could not store new entry.", e);
-        } catch (JsonProcessingException e) {
-            logger.error("Could not write JSON.");
-        }
-        return null;
-    }
-
-    private JobInfo getJobInfo(KafkaJobMessage message) {
-        JobInfo info = null;
-        try {
-            info = connector.getJobInfo(message.getId());
-        } catch (CitesphereCommunicationException e) {
-            // FIXME this needs to be handled better
-            logger.error("Could not get Zotero info.", e);
-            return null;
-        }
-        return info;
-    }
-
-    private String downloadFile(KafkaJobMessage message) {
-        String file = null;
-        try {
-            file = connector.getUploadeFile(message.getId());
-        } catch (CitesphereCommunicationException e) {
-            // FIXME this needs to be handled better
-            logger.error("Could not get Zotero info.", e);
-            return null;
-        }
-        return file;
-    }
+    protected abstract BibEntryIterator getBibEntryIterator(KafkaJobMessage message, JobInfo info);
 }
```
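The refactor applies the template method pattern: AbstractImportProcessor keeps the shared orchestration (job lookup, status messages over Kafka, Zotero submission), and each concrete processor supplies only getBibEntryIterator. This also makes the shared failure paths easy to unit-test through any subclass. Below is a minimal sketch of such a test, assuming JUnit 4 and Mockito 2; the test class name is illustrative, the package of KafkaImportReturnMessage is assumed to match its sibling message classes, and the PR's actual tests (added in commit bed4968) are not shown on this page.

```java
package edu.asu.diging.citesphere.importer.core.service.impl;

import static org.mockito.ArgumentMatchers.any;
import static org.mockito.ArgumentMatchers.eq;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import org.junit.Before;
import org.junit.Test;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.MockitoAnnotations;

import edu.asu.diging.citesphere.importer.core.exception.CitesphereCommunicationException;
import edu.asu.diging.citesphere.importer.core.kafka.impl.KafkaRequestProducer;
import edu.asu.diging.citesphere.importer.core.service.ICitesphereConnector;
import edu.asu.diging.citesphere.messages.KafkaTopics;
// Assumed package: sibling message classes such as KafkaJobMessage live here.
import edu.asu.diging.citesphere.messages.model.KafkaImportReturnMessage;
import edu.asu.diging.citesphere.messages.model.KafkaJobMessage;

public class CrossrefReferenceImportProcessorTest {

    @Mock
    private ICitesphereConnector connector;

    @Mock
    private KafkaRequestProducer requestProducer;

    // Mocks are injected into the fields inherited from AbstractImportProcessor.
    @InjectMocks
    private CrossrefReferenceImportProcessor processorToTest;

    @Before
    public void setUp() {
        MockitoAnnotations.initMocks(this);
    }

    @Test
    public void test_process_whenJobInfoUnavailable_reportsFailure() throws Exception {
        KafkaJobMessage message = mock(KafkaJobMessage.class);
        when(message.getId()).thenReturn("job-1");
        // getJobInfo(...) swallows this exception and returns null, so
        // process(...) should send a FAILED/X10 return message and stop.
        when(connector.getJobInfo("job-1")).thenThrow(CitesphereCommunicationException.class);

        processorToTest.process(message);

        verify(requestProducer).sendRequest(any(KafkaImportReturnMessage.class),
                eq(KafkaTopics.REFERENCES_IMPORT_DONE_TOPIC));
    }
}
```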
Added (16 additions, 0 deletions): ...du/asu/diging/citesphere/importer/core/service/impl/CrossrefReferenceImportProcessor.java
```diff
@@ -0,0 +1,16 @@
+package edu.asu.diging.citesphere.importer.core.service.impl;
+
+import org.springframework.stereotype.Service;
+
+import edu.asu.diging.citesphere.importer.core.service.parse.BibEntryIterator;
+import edu.asu.diging.citesphere.importer.core.service.parse.iterators.CrossRefIterator;
+import edu.asu.diging.citesphere.messages.model.KafkaJobMessage;
+
+@Service
+public class CrossrefReferenceImportProcessor extends AbstractImportProcessor {
+
+    @Override
+    protected BibEntryIterator getBibEntryIterator(KafkaJobMessage message, JobInfo info) {
+        return new CrossRefIterator(info);
+    }
+}
```
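CrossRefIterator itself is added elsewhere in this PR and its source is not part of this excerpt. From the base class's usage, the contract amounts to hasNext()/next() over BibEntry records. A skeletal sketch of an implementation of that contract follows, with all Crossref API details elided; the JobInfo import path is assumed, and any further methods BibEntryIterator may declare are omitted.

```java
package edu.asu.diging.citesphere.importer.core.service.parse.iterators;

import java.util.Collections;
import java.util.Iterator;

import edu.asu.diging.citesphere.importer.core.model.BibEntry;
import edu.asu.diging.citesphere.importer.core.service.parse.BibEntryIterator;
// Assumed import path for JobInfo; not visible in this diff.
import edu.asu.diging.citesphere.messages.model.JobInfo;

// Skeletal sketch only, not the PR's actual CrossRefIterator: the real class
// pages through the Crossref API for the job and maps each work record onto
// a BibEntry before handing it to AbstractImportProcessor.
public class CrossRefIteratorSketch implements BibEntryIterator {

    private final Iterator<BibEntry> entries;

    public CrossRefIteratorSketch(JobInfo info) {
        // A real implementation would fetch the job's records here;
        // the sketch starts out empty.
        this.entries = Collections.<BibEntry>emptyList().iterator();
    }

    @Override
    public boolean hasNext() {
        return entries.hasNext();
    }

    @Override
    public BibEntry next() {
        return entries.next();
    }
}
```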
Review comment (marked resolved by jdamerow) on the second itemTypeMapping.put(Publication.PROCEEDINGS_PAPER, ItemType.CONFERENCE_PAPER) line: "duplicate of line 63"
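The duplicate it refers to is visible in the init() hunk above: Publication.PROCEEDINGS_PAPER is already mapped in the pre-existing code and is mapped again in the added block. Because key and value are identical, the second put is redundant, and the fix (presumably applied in one of the later commits, since the thread is resolved) is simply to drop the repeated line:

```diff
-        itemTypeMapping.put(Publication.PROCEEDINGS_PAPER, ItemType.CONFERENCE_PAPER);
```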