Uses of Enum ubic.basecode.ontology.providers.OntologyService.InferenceMode

Packages that use OntologyService.InferenceMode:

  ubic.basecode.ontology.jena
      Implementation of OntologyService using Apache Jena.

  ubic.basecode.ontology.providers
      This package contains baseCode built-in ontologies and a GenericOntologyService to implement your own ontologies.
Uses of OntologyService.InferenceMode in ubic.basecode.ontology.jena
Methods in ubic.basecode.ontology.jena that return OntologyService.InferenceMode:

  OntologyService.InferenceMode AbstractOntologyService.getInferenceMode()

Methods in ubic.basecode.ontology.jena with parameters of type OntologyService.InferenceMode:

  protected com.hp.hpl.jena.ontology.OntModelSpec AbstractOntologyService.getSpec(OntologyService.LanguageLevel languageLevel, OntologyService.InferenceMode inferenceMode)

  protected abstract OntologyModel AbstractOntologyService.loadModel(boolean processImports, OntologyService.LanguageLevel languageLevel, OntologyService.InferenceMode inferenceMode)
      Delegates the call as to load the model into memory or leave it on disk.

  protected OntologyModel ClasspathOntologyService.loadModel(boolean processImports, OntologyService.LanguageLevel languageLevel, OntologyService.InferenceMode inferenceMode)

  protected OntologyModel TdbOntologyService.loadModel(boolean processImports, OntologyService.LanguageLevel languageLevel, OntologyService.InferenceMode inferenceMode)

  protected OntologyModel UrlOntologyService.loadModel(boolean processImports, OntologyService.LanguageLevel languageLevel, OntologyService.InferenceMode inferenceMode)

  protected abstract OntologyModel AbstractOntologyService.loadModelFromStream(InputStream is, boolean processImports, OntologyService.LanguageLevel languageLevel, OntologyService.InferenceMode inferenceMode)
      Load a model from a given input stream.

  protected OntologyModel ClasspathOntologyService.loadModelFromStream(InputStream is, boolean processImports, OntologyService.LanguageLevel languageLevel, OntologyService.InferenceMode inferenceMode)

  protected OntologyModel TdbOntologyService.loadModelFromStream(InputStream is, boolean processImports, OntologyService.LanguageLevel languageLevel, OntologyService.InferenceMode inferenceMode)

  protected OntologyModel UrlOntologyService.loadModelFromStream(InputStream is, boolean processImports, OntologyService.LanguageLevel languageLevel, OntologyService.InferenceMode inferenceMode)

  void AbstractOntologyService.setInferenceMode(OntologyService.InferenceMode inferenceMode)
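The table above shows a template-method pattern: AbstractOntologyService declares abstract loadModel/loadModelFromStream hooks, and the Classpath/Tdb/Url services each override them for their particular source. The following is a minimal self-contained sketch of that shape; every type here (the enum constants, OntologyModel, the initialize method, the example URL) is a simplified stand-in for illustration, not the real baseCode API.

```java
public class LoadModelSketch {

    // Hypothetical stand-in for OntologyService.InferenceMode.
    enum InferenceMode { NONE, FULL }

    // Hypothetical stand-in for baseCode's OntologyModel.
    interface OntologyModel {
        String describe();
    }

    static abstract class AbstractOntologyService {
        // Each subclass decides how to materialize the model for its source
        // (classpath resource, TDB store, URL, ...).
        protected abstract OntologyModel loadModel(boolean processImports, InferenceMode mode);

        // The base class drives loading; subclasses only supply the hook.
        OntologyModel initialize(InferenceMode mode) {
            return loadModel(true, mode);
        }
    }

    static class UrlOntologyService extends AbstractOntologyService {
        private final String url;

        UrlOntologyService(String url) {
            this.url = url;
        }

        @Override
        protected OntologyModel loadModel(boolean processImports, InferenceMode mode) {
            // A real implementation would fetch and parse the ontology here.
            return () -> "model from " + url + " (inference=" + mode + ")";
        }
    }

    public static void main(String[] args) {
        OntologyModel model = new UrlOntologyService("http://example.com/onto.owl")
                .initialize(InferenceMode.NONE);
        System.out.println(model.describe());
        // prints: model from http://example.com/onto.owl (inference=NONE)
    }
}
```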
Uses of OntologyService.InferenceMode in ubic.basecode.ontology.providers
Methods in ubic.basecode.ontology.providers that return OntologyService.InferenceMode:

  OntologyService.InferenceMode AbstractDelegatingOntologyService.getInferenceMode()

  OntologyService.InferenceMode OntologyService.getInferenceMode()
      Obtain the inference mode used for this ontology.

  static OntologyService.InferenceMode OntologyService.InferenceMode.valueOf(String name)
      Returns the enum constant of this type with the specified name.

  static OntologyService.InferenceMode[] OntologyService.InferenceMode.values()
      Returns an array containing the constants of this enum type, in the order they are declared.

Methods in ubic.basecode.ontology.providers with parameters of type OntologyService.InferenceMode:

  void AbstractDelegatingOntologyService.setInferenceMode(OntologyService.InferenceMode inferenceMode)

  void OntologyService.setInferenceMode(OntologyService.InferenceMode inferenceMode)
      Set the inference mode used for this ontology.
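The valueOf and values entries above are the standard methods the Java compiler generates for every enum type. A self-contained sketch of how they behave (using a hypothetical stand-in enum, since this page does not list the actual constants of OntologyService.InferenceMode):

```java
public class EnumUseSketch {

    // Hypothetical stand-in; the real constant names are declared in
    // ubic.basecode.ontology.providers.OntologyService and may differ.
    enum InferenceMode { NONE, TRANSITIVE, FULL }

    public static void main(String[] args) {
        // values() returns the constants in declaration order.
        for (InferenceMode m : InferenceMode.values()) {
            System.out.println(m.name());
        }
        // valueOf(String) resolves a constant by its exact name and
        // throws IllegalArgumentException for an unknown name.
        InferenceMode parsed = InferenceMode.valueOf("TRANSITIVE");
        System.out.println(parsed == InferenceMode.TRANSITIVE); // true
    }
}
```

A typical use of the setter documented above would then be something like `ontologyService.setInferenceMode(mode)` with one of these constants, before the ontology is initialized.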