Attempt to clean up with clippy, rustfmt, etc.
Integrate https://github.com/mozilla/mentat/pull/806

parent 8aec4048b9
commit 4f81c4e15b

52 changed files with 763 additions and 621 deletions
.gitignore (vendored): 1 change

@@ -92,3 +92,4 @@ build.xcarchive
 docs/_site
 docs/.sass-cache
 docs/.jekyll-metadata
.travis.yml: 19 changes

@@ -1,27 +1,38 @@
 language: rust
+cache: cargo # cache cargo-audit once installed
+before_script:
+  - cargo install --git https://github.com/rust-lang/rust-clippy/ --force clippy
+  - cargo install --force cargo-audit
+  - cargo generate-lockfile
+script:
+  - cargo audit
 # We use OSX so that we can get a reasonably up to date version of SQLCipher.
 # (The version in Travis's default Ubuntu Trusty is much too old).
 os: osx
 before_install:
-  - brew install sqlcipher --with-fts
+  - brew install sqlcipher
 rust:
-  - 1.25.0
+  - 1.41.0
   - stable
   - beta
   - nightly
 matrix:
   allow_failures:
     - rust: stable
     - rust: beta
     - rust: nightly
   fast_finish: true
 jobs:
   include:
     - stage: "Test iOS"
-      rust: 1.25.0
+      rust: 1.41.0
       script: ./scripts/test-ios.sh
     - stage: "Docs"
-      rust: 1.25.0
+      rust: 1.41.0
       script: ./scripts/cargo-doc.sh
 script:
   - cargo build --verbose --all
+  - cargo clippy --all-targets --all-features -- -D warnings # Check tests and non-default crate features.
   - cargo test --verbose --all
+  - cargo test --features edn/serde_support --verbose --all
 # We can't pick individual features out with `cargo test --all` (At the time of this writing, this
@@ -28,6 +28,13 @@ members = ["tools/cli", "ffi"]

 [build-dependencies]
 rustc_version = "0.2"

+[dev-dependencies.cargo-husky]
+version = "1"
+default-features = false # Disable features which are enabled by default
+features = ["run-for-all", "precommit-hook", "run-cargo-fmt", "run-cargo-test", "run-cargo-check", "run-cargo-clippy"]
+# cargo audit
+# cargo outdated
+
 [dependencies]
 chrono = "0.4"
 failure = "0.1.6"
README.md: 22 changes

@@ -1,17 +1,13 @@
 # Project Mentat
-[![Build Status](https://travis-ci.org/mozilla/mentat.svg?branch=master)](https://travis-ci.org/mozilla/mentat)
-
-**Project Mentat is [no longer being developed or actively maintained by Mozilla](https://mail.mozilla.org/pipermail/firefox-dev/2018-September/006780.html).** This repository will be marked read-only in the near future. You are, of course, welcome to fork the repository and use the existing code.
+[![Build Status](https://travis-ci.org/qpdb/mentat.svg?branch=master)](https://travis-ci.org/qpdb/mentat)

 Project Mentat is a persistent, embedded knowledge base. It draws heavily on [DataScript](https://github.com/tonsky/datascript) and [Datomic](http://datomic.com).

-Mentat is implemented in Rust.
+This project was started by Mozilla, but [is no longer being developed or actively maintained by them](https://mail.mozilla.org/pipermail/firefox-dev/2018-September/006780.html). [Their repository](https://github.com/mozilla/mentat) was marked read-only, and [this fork](https://github.com/qpdb/mentat) is an attempt to revive and continue that interesting work. We owe the team at Mozilla more than words can express for inspiring us all and for this project in particular.

-The first version of Project Mentat, named Datomish, [was written in ClojureScript](https://github.com/mozilla/mentat/tree/clojure), targeting both Node (on top of `promise_sqlite`) and Firefox (on top of `Sqlite.jsm`). It also worked in pure Clojure on the JVM on top of `jdbc-sqlite`. The name was changed to avoid confusion with [Datomic](http://datomic.com).
+*Thank you*.

 The Rust implementation gives us a smaller compiled output, better performance, more type safety, better tooling, and easier deployment into Firefox and mobile platforms.

-[Documentation](https://mozilla.github.io/mentat)
+[Documentation](https://docs.rs/mentat)

 ---

@@ -77,9 +73,11 @@ We've observed that data storage is a particular area of difficulty for software

 DataScript asks the question: "What if creating a database were as cheap as creating a Hashmap?"

-Mentat is not interested in that. Instead, it's strongly interested in persistence and performance, with very little interest in immutable databases/databases as values or throwaway use.
+Mentat is not interested in that. Instead, it's focused on persistence and performance, with very little interest in immutable databases/databases as values or throwaway use.

-One might say that Mentat's question is: "What if an SQLite database could store arbitrary relations, for arbitrary consumers, without them having to coordinate an up-front storage-level schema?"
+One might say that Mentat's question is: "What if a database could store arbitrary relations, for arbitrary consumers, without them having to coordinate an up-front storage-level schema?"

 Consider this a practical approach to facts, to knowledge, its storage and access, much like SQLite is a practical RDBMS.

 (Note that [domain-level schemas are very valuable](http://martinfowler.com/articles/schemaless/).)

@@ -89,7 +87,7 @@ Some thought has been given to how databases as values — long-term references

 Just like DataScript, Mentat speaks Datalog for querying and takes additions and retractions as input to a transaction.

-Unlike DataScript, Mentat exposes free-text indexing, thanks to SQLite.
+Unlike DataScript, Mentat exposes free-text indexing, thanks to SQLite/FTS.

 ## Comparison to Datomic

@@ -98,8 +96,6 @@ Datomic is a server-side, enterprise-grade data storage system. Datomic has a be

 Many of these design decisions are inapplicable to deployed desktop software; indeed, the use of multiple JVM processes makes Datomic's use in a small desktop app, or a mobile device, prohibitive.

 Mentat was designed for embedding, initially in an experimental Electron app ([Tofino](https://github.com/mozilla/tofino)). It is less concerned with exposing consistent database states outside transaction boundaries, because that's less important here, and dropping some of these requirements allows us to leverage SQLite itself.

 ## Comparison to SQLite
@@ -14,7 +14,7 @@ use std::process::exit;

 /// MIN_VERSION should be changed when there's a new minimum version of rustc required
 /// to build the project.
-static MIN_VERSION: &'static str = "1.41.0";
+static MIN_VERSION: &str = "1.40.0";

 fn main() {
     let ver = version().unwrap();
@@ -102,7 +102,7 @@ impl<V: TransactableValueMarker> Into<ValuePlace<V>> for KnownEntid {
 /// When moving to a more concrete table, such as `datoms`, they are expanded out
 /// via these flags and put into their own column rather than a bit field.
 pub enum AttributeBitFlags {
-    IndexAVET = 1 << 0,
+    IndexAVET = 1,
     IndexVAET = 1 << 1,
     IndexFulltext = 1 << 2,
     UniqueValue = 1 << 3,
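The `1 << 0` to `1` change silences clippy's `identity_op` lint: shifting by zero is a no-op, so both spellings produce the same discriminant. A minimal sketch of the pattern, using a hypothetical `Flags` enum (not Mentat's actual type) to show how such discriminants combine into a bit field:

```rust
// Hypothetical flags enum mirroring the AttributeBitFlags pattern.
// `1` and `1 << 0` are identical; clippy's `identity_op` prefers the former.
#[derive(Clone, Copy)]
enum Flags {
    IndexAvet = 1, // was `1 << 0`
    IndexVaet = 1 << 1,
    Fulltext = 1 << 2,
    Unique = 1 << 3,
}

// Test whether a packed bit field has a given flag set.
fn has_flag(field: u8, f: Flags) -> bool {
    field & (f as u8) != 0
}

fn main() {
    let field = Flags::IndexAvet as u8 | Flags::Fulltext as u8; // 0b0101
    assert!(has_flag(field, Flags::IndexAvet));
    assert!(!has_flag(field, Flags::IndexVaet));
}
```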
@@ -327,20 +327,20 @@ impl ValueType {

     pub fn from_keyword(keyword: &Keyword) -> Option<Self> {
         if keyword.namespace() != Some("db.type") {
-            return None;
-        }
-
-        return match keyword.name() {
-            "ref" => Some(ValueType::Ref),
-            "boolean" => Some(ValueType::Boolean),
-            "instant" => Some(ValueType::Instant),
-            "long" => Some(ValueType::Long),
-            "double" => Some(ValueType::Double),
-            "string" => Some(ValueType::String),
-            "keyword" => Some(ValueType::Keyword),
-            "uuid" => Some(ValueType::Uuid),
-            _ => None,
-        };
+            None
+        } else {
+            match keyword.name() {
+                "ref" => Some(ValueType::Ref),
+                "boolean" => Some(ValueType::Boolean),
+                "instant" => Some(ValueType::Instant),
+                "long" => Some(ValueType::Long),
+                "double" => Some(ValueType::Double),
+                "string" => Some(ValueType::String),
+                "keyword" => Some(ValueType::Keyword),
+                "uuid" => Some(ValueType::Uuid),
+                _ => None,
+            }
+        }
     }

     pub fn into_typed_value(self) -> TypedValue {
@@ -372,9 +372,9 @@ impl ValueType {
         }
     }

-    pub fn is_numeric(&self) -> bool {
+    pub fn is_numeric(self) -> bool {
         match self {
-            &ValueType::Long | &ValueType::Double => true,
+            ValueType::Long | ValueType::Double => true,
             _ => false,
         }
     }
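Taking `self` by value here follows clippy's `trivially_copy_pass_by_ref` lint: for a small `Copy` enum, copying is as cheap as passing a reference, and the match arms no longer need `&` patterns. A sketch with a stand-in enum (not Mentat's actual `ValueType`):

```rust
#[derive(Clone, Copy, PartialEq)]
enum Kind {
    Long,
    Double,
    Text,
}

impl Kind {
    // By-value receiver: `Kind` is `Copy`, so no borrow is needed,
    // and the arms match on `Kind::...` instead of `&Kind::...`.
    fn is_numeric(self) -> bool {
        matches!(self, Kind::Long | Kind::Double)
    }
}

fn main() {
    assert!(Kind::Long.is_numeric());
    assert!(!Kind::Text.is_numeric());
}
```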
@@ -440,14 +440,14 @@ impl TypedValue {

     pub fn value_type(&self) -> ValueType {
         match self {
-            &TypedValue::Ref(_) => ValueType::Ref,
-            &TypedValue::Boolean(_) => ValueType::Boolean,
-            &TypedValue::Long(_) => ValueType::Long,
-            &TypedValue::Instant(_) => ValueType::Instant,
-            &TypedValue::Double(_) => ValueType::Double,
-            &TypedValue::String(_) => ValueType::String,
-            &TypedValue::Keyword(_) => ValueType::Keyword,
-            &TypedValue::Uuid(_) => ValueType::Uuid,
+            TypedValue::Ref(_) => ValueType::Ref,
+            TypedValue::Boolean(_) => ValueType::Boolean,
+            TypedValue::Long(_) => ValueType::Long,
+            TypedValue::Instant(_) => ValueType::Instant,
+            TypedValue::Double(_) => ValueType::Double,
+            TypedValue::String(_) => ValueType::String,
+            TypedValue::Keyword(_) => ValueType::Keyword,
+            TypedValue::Uuid(_) => ValueType::Uuid,
         }
     }
@@ -770,21 +770,21 @@ impl Binding {

     pub fn as_scalar(&self) -> Option<&TypedValue> {
         match self {
-            &Binding::Scalar(ref v) => Some(v),
+            Binding::Scalar(ref v) => Some(v),
             _ => None,
         }
     }

     pub fn as_vec(&self) -> Option<&Vec<Binding>> {
         match self {
-            &Binding::Vec(ref v) => Some(v),
+            Binding::Vec(ref v) => Some(v),
             _ => None,
         }
     }

     pub fn as_map(&self) -> Option<&StructuredMap> {
         match self {
-            &Binding::Map(ref v) => Some(v),
+            Binding::Map(ref v) => Some(v),
             _ => None,
         }
     }
@@ -856,10 +856,10 @@ impl Binding {

     pub fn value_type(&self) -> Option<ValueType> {
         match self {
-            &Binding::Scalar(ref v) => Some(v.value_type()),
+            Binding::Scalar(ref v) => Some(v.value_type()),

-            &Binding::Map(_) => None,
-            &Binding::Vec(_) => None,
+            Binding::Map(_) => None,
+            Binding::Vec(_) => None,
         }
     }
 }
@@ -970,56 +970,56 @@ impl Binding {

     pub fn as_entid(&self) -> Option<&Entid> {
         match self {
-            &Binding::Scalar(TypedValue::Ref(ref v)) => Some(v),
+            Binding::Scalar(TypedValue::Ref(ref v)) => Some(v),
             _ => None,
         }
     }

     pub fn as_kw(&self) -> Option<&ValueRc<Keyword>> {
         match self {
-            &Binding::Scalar(TypedValue::Keyword(ref v)) => Some(v),
+            Binding::Scalar(TypedValue::Keyword(ref v)) => Some(v),
             _ => None,
         }
     }

     pub fn as_boolean(&self) -> Option<&bool> {
         match self {
-            &Binding::Scalar(TypedValue::Boolean(ref v)) => Some(v),
+            Binding::Scalar(TypedValue::Boolean(ref v)) => Some(v),
             _ => None,
         }
     }

     pub fn as_long(&self) -> Option<&i64> {
         match self {
-            &Binding::Scalar(TypedValue::Long(ref v)) => Some(v),
+            Binding::Scalar(TypedValue::Long(ref v)) => Some(v),
             _ => None,
         }
     }

     pub fn as_double(&self) -> Option<&f64> {
         match self {
-            &Binding::Scalar(TypedValue::Double(ref v)) => Some(&v.0),
+            Binding::Scalar(TypedValue::Double(ref v)) => Some(&v.0),
             _ => None,
         }
     }

     pub fn as_instant(&self) -> Option<&DateTime<Utc>> {
         match self {
-            &Binding::Scalar(TypedValue::Instant(ref v)) => Some(v),
+            Binding::Scalar(TypedValue::Instant(ref v)) => Some(v),
             _ => None,
         }
     }

     pub fn as_string(&self) -> Option<&ValueRc<String>> {
         match self {
-            &Binding::Scalar(TypedValue::String(ref v)) => Some(v),
+            Binding::Scalar(TypedValue::String(ref v)) => Some(v),
             _ => None,
         }
     }

     pub fn as_uuid(&self) -> Option<&Uuid> {
         match self {
-            &Binding::Scalar(TypedValue::Uuid(ref v)) => Some(v),
+            Binding::Scalar(TypedValue::Uuid(ref v)) => Some(v),
             _ => None,
         }
     }
@@ -92,53 +92,53 @@ impl ValueTypeSet {
         self.0.insert(vt)
     }

-    pub fn len(&self) -> usize {
+    pub fn len(self) -> usize {
         self.0.len()
     }

     /// Returns a set containing all the types in this set and `other`.
-    pub fn union(&self, other: &ValueTypeSet) -> ValueTypeSet {
+    pub fn union(self, other: ValueTypeSet) -> ValueTypeSet {
         ValueTypeSet(self.0.union(other.0))
     }

-    pub fn intersection(&self, other: &ValueTypeSet) -> ValueTypeSet {
+    pub fn intersection(self, other: ValueTypeSet) -> ValueTypeSet {
         ValueTypeSet(self.0.intersection(other.0))
     }

     /// Returns the set difference between `self` and `other`, which is the
     /// set of items in `self` that are not in `other`.
-    pub fn difference(&self, other: &ValueTypeSet) -> ValueTypeSet {
+    pub fn difference(self, other: ValueTypeSet) -> ValueTypeSet {
         ValueTypeSet(self.0 - other.0)
     }

     /// Return an arbitrary type that's part of this set.
     /// For a set containing a single type, this will be that type.
-    pub fn exemplar(&self) -> Option<ValueType> {
+    pub fn exemplar(self) -> Option<ValueType> {
         self.0.iter().next()
     }

-    pub fn is_subset(&self, other: &ValueTypeSet) -> bool {
+    pub fn is_subset(self, other: ValueTypeSet) -> bool {
         self.0.is_subset(&other.0)
     }

     /// Returns true if `self` and `other` contain no items in common.
-    pub fn is_disjoint(&self, other: &ValueTypeSet) -> bool {
+    pub fn is_disjoint(self, other: ValueTypeSet) -> bool {
         self.0.is_disjoint(&other.0)
     }

-    pub fn contains(&self, vt: ValueType) -> bool {
+    pub fn contains(self, vt: ValueType) -> bool {
         self.0.contains(&vt)
     }

-    pub fn is_empty(&self) -> bool {
+    pub fn is_empty(self) -> bool {
         self.0.is_empty()
     }

-    pub fn is_unit(&self) -> bool {
+    pub fn is_unit(self) -> bool {
         self.0.len() == 1
     }

-    pub fn iter(&self) -> ::enum_set::Iter<ValueType> {
+    pub fn iter(self) -> ::enum_set::Iter<ValueType> {
         self.0.iter()
     }
 }
@@ -150,8 +150,8 @@ impl From<ValueType> for ValueTypeSet {
 }

 impl ValueTypeSet {
-    pub fn is_only_numeric(&self) -> bool {
-        self.is_subset(&ValueTypeSet::of_numeric_types())
+    pub fn is_only_numeric(self) -> bool {
+        self.is_subset(ValueTypeSet::of_numeric_types())
     }
 }
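The `ValueTypeSet` hunks above all make the same move: because the wrapper is a small `Copy` type, its methods can take `self` and `other` by value, dropping the `&` noise throughout. A sketch with a hypothetical `TypeSet` newtype over a `u8` bitmask (Mentat's real type wraps an `enum_set` set, which this stands in for):

```rust
// Hypothetical Copy newtype standing in for ValueTypeSet.
// By-value receivers are fine because copying a u8 wrapper is trivial.
#[derive(Clone, Copy, PartialEq, Debug)]
struct TypeSet(u8);

impl TypeSet {
    fn union(self, other: TypeSet) -> TypeSet { TypeSet(self.0 | other.0) }
    fn intersection(self, other: TypeSet) -> TypeSet { TypeSet(self.0 & other.0) }
    fn difference(self, other: TypeSet) -> TypeSet { TypeSet(self.0 & !other.0) }
    fn is_subset(self, other: TypeSet) -> bool { self.0 & !other.0 == 0 }
}

fn main() {
    let a = TypeSet(0b0011);
    let b = TypeSet(0b0110);
    assert_eq!(a.union(b), TypeSet(0b0111));
    assert_eq!(a.intersection(b), TypeSet(0b0010));
    assert_eq!(a.difference(b), TypeSet(0b0001));
    assert!(TypeSet(0b0010).is_subset(b));
}
```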
@@ -19,11 +19,11 @@ use edn::types::Value;
 /// Declare a lazy static `ident` of type `Value::Keyword` with the given `namespace` and
 /// `name`.
 ///
-/// It may look surprising that we declare a new `lazy_static!` block rather than including
+/// It may look surprising to declare a new `lazy_static!` block rather than including
 /// invocations inside an existing `lazy_static!` block. The latter cannot be done, since macros
-/// are expanded outside-in. Looking at the `lazy_static!` source suggests that there is no harm in
-/// repeating that macro, since internally a multi-`static` block is expanded into many
-/// single-`static` blocks.
+/// will be expanded outside-in. Looking at the `lazy_static!` source suggests that there is no
+/// harm in repeating that macro, since internally a multi-`static` block will be expanded into
+/// many single-`static` blocks.
 ///
 /// TODO: take just ":db.part/db" and define DB_PART_DB using "db.part" and "db".
 macro_rules! lazy_static_namespaced_keyword_value (
@@ -11,7 +11,7 @@
 use std::cell::Cell;
 use std::rc::Rc;

-#[derive(Clone)]
+#[derive(Clone, Default)]
 pub struct RcCounter {
     c: Rc<Cell<usize>>,
 }
@@ -135,7 +135,7 @@ impl Schema {
     }

     fn get_raw_entid(&self, x: &Keyword) -> Option<Entid> {
-        self.ident_map.get(x).map(|x| *x)
+        self.ident_map.get(x).copied()
     }

     pub fn update_component_attributes(&mut self) {
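Replacing `.map(|x| *x)` with `.copied()` is what clippy's `map_clone` lint suggests when the mapped type is `Copy`. A small sketch, using an illustrative map rather than Mentat's `ident_map`:

```rust
use std::collections::BTreeMap;

// Both spellings copy the Copy value (`i64`) out of the borrowed entry;
// `copied()` states the intent directly.
fn get_entid(map: &BTreeMap<&str, i64>, k: &str) -> Option<i64> {
    map.get(k).copied() // previously: map.get(k).map(|x| *x)
}

fn main() {
    let mut m = BTreeMap::new();
    m.insert(":db/ident", 1);
    assert_eq!(get_entid(&m, ":db/ident"), Some(1));
    assert_eq!(get_entid(&m, ":db/missing"), None);
}
```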
@@ -69,7 +69,7 @@ impl ::std::fmt::Display for SchemaConstraintViolation {
     fn fmt(&self, f: &mut ::std::fmt::Formatter) -> ::std::fmt::Result {
         use self::SchemaConstraintViolation::*;
         match self {
-            &ConflictingUpserts {
+            ConflictingUpserts {
                 ref conflicting_upserts,
             } => {
                 writeln!(f, "conflicting upserts:")?;

@@ -78,7 +78,7 @@ impl ::std::fmt::Display for SchemaConstraintViolation {
                 }
                 Ok(())
             }
-            &TypeDisagreements {
+            TypeDisagreements {
                 ref conflicting_datoms,
             } => {
                 writeln!(f, "type disagreements:")?;

@@ -91,9 +91,9 @@ impl ::std::fmt::Display for SchemaConstraintViolation {
                 }
                 Ok(())
             }
-            &CardinalityConflicts { ref conflicts } => {
+            CardinalityConflicts { ref conflicts } => {
                 writeln!(f, "cardinality conflicts:")?;
-                for ref conflict in conflicts {
+                for conflict in conflicts {
                     writeln!(f, " {:?}", conflict)?;
                 }
                 Ok(())

@@ -116,10 +116,10 @@ impl ::std::fmt::Display for InputError {
     fn fmt(&self, f: &mut ::std::fmt::Formatter) -> ::std::fmt::Result {
         use self::InputError::*;
         match self {
-            &BadDbId => {
+            BadDbId => {
                 writeln!(f, ":db/id in map notation must either not be present or be an entid, an ident, or a tempid")
             },
-            &BadEntityPlace => {
+            BadEntityPlace => {
                 writeln!(f, "cannot convert value place into entity place")
             },
         }

@@ -163,7 +163,7 @@ impl From<DbErrorKind> for DbError {

 impl From<Context<DbErrorKind>> for DbError {
     fn from(inner: Context<DbErrorKind>) -> Self {
-        DbError { inner: inner }
+        DbError { inner }
     }
 }
@@ -48,12 +48,10 @@
             } else {
                 self.asserted.insert(key, value);
             }
-        } else {
-            if let Some(asserted_value) = self.asserted.remove(&key) {
-                self.altered.insert(key, (value, asserted_value));
-            } else {
-                self.retracted.insert(key, value);
-            }
+        } else if let Some(asserted_value) = self.asserted.remove(&key) {
+            self.altered.insert(key, (value, asserted_value));
+        } else {
+            self.retracted.insert(key, value);
         }
     }
 }
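This hunk flattens `else { if … }` into `else if …`, as clippy's collapsible-if family of lints suggests; behavior is unchanged, one nesting level disappears. A toy illustration of the same shape:

```rust
// `else { if cond { … } }` flattens to `else if cond { … }` with no
// change in behavior, just one less level of indentation.
fn classify(asserted: bool, altered: bool) -> &'static str {
    if asserted {
        "asserted"
    } else if altered {
        "altered"
    } else {
        "retracted"
    }
}

fn main() {
    assert_eq!(classify(true, false), "asserted");
    assert_eq!(classify(false, true), "altered");
    assert_eq!(classify(false, false), "retracted");
}
```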
@@ -27,7 +27,7 @@ use types::{Partition, PartitionMap};
 /// The first transaction ID applied to the knowledge base.
 ///
 /// This is the start of the :db.part/tx partition.
-pub const TX0: i64 = 0x10000000;
+pub const TX0: i64 = 0x1000_0000;

 /// This is the start of the :db.part/user partition.
 pub const USER0: i64 = 0x10000;
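Underscore separators in numeric literals are ignored by the compiler; they only group digits for readers, which is what clippy's `unreadable_literal` lint asks for on long literals. A quick demonstration that the values are unchanged:

```rust
// Separators change nothing about the value, only its readability.
const TX0: i64 = 0x1000_0000;
const USER0: i64 = 0x10000;

fn main() {
    assert_eq!(TX0, 0x10000000);     // same value, with or without `_`
    assert_eq!(TX0, 268_435_456);    // decimal spelling
    assert_eq!(USER0, 65_536);
}
```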
@@ -206,14 +206,14 @@ lazy_static! {
 /// Convert (ident, entid) pairs into [:db/add IDENT :db/ident IDENT] `Value` instances.
 fn idents_to_assertions(idents: &[(symbols::Keyword, i64)]) -> Vec<Value> {
     idents
-        .into_iter()
+        .iter()
         .map(|&(ref ident, _)| {
             let value = Value::Keyword(ident.clone());
             Value::Vector(vec![
                 values::DB_ADD.clone(),
                 value.clone(),
                 values::DB_IDENT.clone(),
-                value.clone(),
+                value,
             ])
         })
         .collect()
@@ -225,7 +225,7 @@ fn schema_attrs_to_assertions(version: u32, idents: &[symbols::Keyword]) -> Vec<
     let schema_attr = Value::Keyword(ns_keyword!("db.schema", "attribute"));
     let schema_version = Value::Keyword(ns_keyword!("db.schema", "version"));
     idents
-        .into_iter()
+        .iter()
         .map(|ident| {
             let value = Value::Keyword(ident.clone());
             Value::Vector(vec![
@@ -260,7 +260,7 @@ fn symbolic_schema_to_triples(
         Value::Map(ref m) => {
             for (ident, mp) in m {
                 let ident = match ident {
-                    &Value::Keyword(ref ident) => ident,
+                    Value::Keyword(ref ident) => ident,
                     _ => bail!(DbErrorKind::BadBootstrapDefinition(format!(
                         "Expected namespaced keyword for ident but got '{:?}'",
                         ident

@@ -270,7 +270,7 @@ fn symbolic_schema_to_triples(
                     Value::Map(ref mpp) => {
                         for (attr, value) in mpp {
                             let attr = match attr {
-                                &Value::Keyword(ref attr) => attr,
+                                Value::Keyword(ref attr) => attr,
                                 _ => bail!(DbErrorKind::BadBootstrapDefinition(format!(
                                     "Expected namespaced keyword for attr but got '{:?}'",
                                     attr

@@ -289,7 +289,7 @@ fn symbolic_schema_to_triples(
                                 Some(TypedValue::Keyword(ref k)) => ident_map
                                     .get(k)
                                     .map(|entid| TypedValue::Ref(*entid))
-                                    .ok_or(DbErrorKind::UnrecognizedIdent(k.to_string()))?,
+                                    .ok_or_else(|| DbErrorKind::UnrecognizedIdent(k.to_string()))?,
                                 Some(v) => v,
                                 _ => bail!(DbErrorKind::BadBootstrapDefinition(format!(
                                     "Expected Mentat typed value for value but got '{:?}'",
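The `ok_or` to `ok_or_else` change addresses clippy's `or_fun_call` lint: `ok_or` evaluates its argument eagerly, even when the `Option` is `Some`, while `ok_or_else` defers constructing the (allocating) error to a closure that only runs on the `None` path. A sketch with a hypothetical lookup:

```rust
// Hypothetical lookup; the error String is only built when the value
// is absent, thanks to the closure passed to `ok_or_else`.
fn lookup(found: bool) -> Result<i64, String> {
    let entid = if found { Some(42) } else { None };
    entid.ok_or_else(|| "unrecognized ident".to_string())
}

fn main() {
    assert_eq!(lookup(true), Ok(42));
    assert_eq!(lookup(false), Err("unrecognized ident".to_string()));
}
```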
@@ -377,8 +377,6 @@ pub(crate) fn bootstrap_entities() -> Vec<Entity<edn::ValueAndSpan>> {
     );

     // Failure here is a coding error (since the inputs are fixed), not a runtime error.
-    // TODO: represent these bootstrap data errors rather than just panicing.
-    let bootstrap_entities: Vec<Entity<edn::ValueAndSpan>> =
-        edn::parse::entities(&bootstrap_assertions.to_string()).expect("bootstrap assertions");
-    return bootstrap_entities;
+    // TODO: represent these bootstrap entity data errors rather than just panicing.
+    edn::parse::entities(&bootstrap_assertions.to_string()).expect("bootstrap assertions")
 }
@@ -54,8 +54,6 @@ use std::collections::btree_map::Entry::{Occupied, Vacant};

 use std::iter::once;

-use std::mem;
-
 use std::sync::Arc;

 use std::iter::Peekable;
@@ -190,7 +188,7 @@ impl AevFactory {
                 return existing;
             }
             self.strings.insert(rc.clone());
-            return TypedValue::String(rc);
+            TypedValue::String(rc)
         }
         t => t,
     }
@@ -377,7 +375,7 @@ impl RemoveFromCache for MultiValAttributeCache {

 impl CardinalityManyCache for MultiValAttributeCache {
     fn acc(&mut self, e: Entid, v: TypedValue) {
-        self.e_vs.entry(e).or_insert(vec![]).push(v)
+        self.e_vs.entry(e).or_insert_with(|| vec![]).push(v)
     }

     fn set(&mut self, e: Entid, vs: Vec<TypedValue>) {
@@ -439,7 +437,7 @@ impl UniqueReverseAttributeCache {
     }

     fn get_e(&self, v: &TypedValue) -> Option<Entid> {
-        self.v_e.get(v).and_then(|o| o.clone())
+        self.v_e.get(v).and_then(|o| *o)
     }

     fn lookup(&self, v: &TypedValue) -> Option<Option<Entid>> {
@@ -494,7 +492,7 @@ impl RemoveFromCache for NonUniqueReverseAttributeCache {

 impl NonUniqueReverseAttributeCache {
     fn acc(&mut self, e: Entid, v: TypedValue) {
-        self.v_es.entry(v).or_insert(BTreeSet::new()).insert(e);
+        self.v_es.entry(v).or_insert_with(BTreeSet::new).insert(e);
     }

     fn get_es(&self, v: &TypedValue) -> Option<&BTreeSet<Entid>> {
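Both `or_insert` hunks are instances of clippy's `or_fun_call` lint on the entry API: `or_insert(BTreeSet::new())` constructs an empty set on every call and usually discards it, while `or_insert_with(BTreeSet::new)` only calls the constructor when the key is vacant. A sketch with an illustrative accumulator:

```rust
use std::collections::{BTreeMap, BTreeSet};

// Accumulate entids per value; the empty BTreeSet is only built when the
// key is first seen, because `or_insert_with` takes a constructor, not a value.
fn acc(v_es: &mut BTreeMap<String, BTreeSet<i64>>, v: &str, e: i64) {
    v_es.entry(v.to_string()).or_insert_with(BTreeSet::new).insert(e);
}

fn main() {
    let mut cache = BTreeMap::new();
    acc(&mut cache, "value", 1);
    acc(&mut cache, "value", 2);
    assert_eq!(cache["value"].len(), 2);
}
```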
@@ -643,9 +641,9 @@ enum AccumulationBehavior {
 }

 impl AccumulationBehavior {
-    fn is_replacing(&self) -> bool {
+    fn is_replacing(self) -> bool {
         match self {
-            &AccumulationBehavior::Add { replacing } => replacing,
+            AccumulationBehavior::Add { replacing } => replacing,
             _ => false,
         }
     }
@@ -662,7 +660,7 @@ pub struct AttributeCaches {
     non_unique_reverse: BTreeMap<Entid, NonUniqueReverseAttributeCache>,
 }

-// TODO: if an entity or attribute is ever renumbered, the cache will need to be rebuilt.
+// TODO: if an entity or attribute is ever re-numbered, the cache will need to be rebuilt.
 impl AttributeCaches {
     //
     // These function names are brief and local.
@@ -1006,7 +1004,7 @@ impl AttributeCaches {
     }
 }

-// We need this block for fallback.
+// We need this block for fall-back.
 impl AttributeCaches {
     fn get_entid_for_value_if_present(
         &self,
@@ -1076,7 +1074,7 @@ impl AttributeCaches {
     ) -> Result<()> {
         let mut aev_factory = AevFactory::new();
         let rows = statement.query_map(&args, |row| Ok(aev_factory.row_to_aev(row)))?;
-        let aevs = AevRows { rows: rows };
+        let aevs = AevRows { rows };
         self.accumulate_into_cache(
             None,
             schema,
@@ -1132,7 +1130,7 @@ impl AttributeCaches {
         schema: &'s Schema,
         sqlite: &'c rusqlite::Connection,
         attrs: AttributeSpec,
-        entities: &Vec<Entid>,
+        entities: &[Entid],
     ) -> Result<()> {
         // Mark the attributes as cached as we go. We do this because we're going in through the
         // back door here, and the usual caching API won't have taken care of this for us.
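The `&Vec<Entid>` to `&[Entid]` signature changes follow clippy's `ptr_arg` lint: a slice parameter accepts vectors, arrays, and sub-slices alike, with no change to the function body. A small sketch:

```rust
// Take `&[T]` instead of `&Vec<T>`; `&Vec<i64>` coerces to `&[i64]`
// automatically at call sites, and other slice sources work too.
fn sum(entities: &[i64]) -> i64 {
    entities.iter().sum()
}

fn main() {
    let v = vec![1, 2, 3];
    assert_eq!(sum(&v), 6);      // &Vec<i64> coerces to &[i64]
    assert_eq!(sum(&[4, 5]), 9); // arrays work as well
}
```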
@@ -1229,17 +1227,17 @@ impl AttributeCaches {
         schema: &'s Schema,
         sqlite: &'c rusqlite::Connection,
         mut attrs: AttributeSpec,
-        entities: &Vec<Entid>,
+        entities: &[Entid],
     ) -> Result<()> {
         // TODO: Exclude any entities for which every attribute is known.
         // TODO: initialize from an existing (complete) AttributeCache.

         // Exclude any attributes for which every entity's value is already known.
         match &mut attrs {
-            &mut AttributeSpec::All => {
+            AttributeSpec::All => {
                 // If we're caching all attributes, there's nothing we can exclude.
             }
-            &mut AttributeSpec::Specified {
+            AttributeSpec::Specified {
                 ref mut non_fts,
                 ref mut fts,
             } => {
@@ -1285,7 +1283,7 @@ impl AttributeCaches {
         schema: &'s Schema,
         sqlite: &'c rusqlite::Connection,
         attrs: AttributeSpec,
-        entities: &Vec<Entid>,
+        entities: &[Entid],
     ) -> Result<AttributeCaches> {
         let mut cache = AttributeCaches::default();
         cache.populate_cache_for_entities_and_attributes(schema, sqlite, attrs, entities)?;
@@ -1450,7 +1448,7 @@ pub struct SQLiteAttributeCache {
 }

 impl SQLiteAttributeCache {
-    fn make_mut<'s>(&'s mut self) -> &'s mut AttributeCaches {
+    fn make_mut(&mut self) -> &mut AttributeCaches {
         Arc::make_mut(&mut self.inner)
     }
@@ -1628,7 +1626,7 @@ impl InProgressSQLiteAttributeCache {
         let overlay = inner.make_override();
         InProgressSQLiteAttributeCache {
             inner: inner.inner,
-            overlay: overlay,
+            overlay,
             unregistered_forward: Default::default(),
             unregistered_reverse: Default::default(),
         }
@@ -1818,9 +1816,7 @@ impl CachedAttributes for InProgressSQLiteAttributeCache {
             .inner
             .forward_cached_attributes
             .iter()
-            .filter(|a| !self.unregistered_forward.contains(a))
-            .next()
-            .is_some()
+            .any(|a| !self.unregistered_forward.contains(a))
         {
             return true;
         }

@@ -1828,9 +1824,7 @@ impl CachedAttributes for InProgressSQLiteAttributeCache {
         self.inner
             .reverse_cached_attributes
             .iter()
-            .filter(|a| !self.unregistered_reverse.contains(a))
-            .next()
-            .is_some()
+            .any(|a| !self.unregistered_reverse.contains(a))
     }

     fn get_entids_for_value(
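Both hunks above replace the `.filter(p).next().is_some()` chain with `.any(p)`, which clippy's `filter_next` lint flags: both walk the iterator the same way and short-circuit on the first match, but `any` states the intent in one call. A small demonstration with illustrative data:

```rust
// Returns true if any cached id is not in the unregistered list;
// the one-call `any` form replaces `.filter(…).next().is_some()`.
fn has_uncached(cached: &[i64], unregistered: &[i64]) -> bool {
    cached.iter().any(|a| !unregistered.contains(a))
}

fn main() {
    // 3 is cached and not unregistered, so this is true.
    assert!(has_uncached(&[2, 3], &[1, 2]));
    // Every cached id is unregistered here.
    assert!(!has_uncached(&[1], &[1]));
}
```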
@@ -1944,7 +1938,7 @@ impl<'a> InProgressCacheTransactWatcher<'a> {
         let mut w = InProgressCacheTransactWatcher {
             collected_assertions: Default::default(),
             collected_retractions: Default::default(),
-            cache: cache,
+            cache,
             active: true,
         };
@@ -1977,10 +1971,10 @@ impl<'a> TransactWatcher for InProgressCacheTransactWatcher<'a> {
             }
             Entry::Occupied(mut entry) => {
                 match entry.get_mut() {
-                    &mut Either::Left(_) => {
+                    Either::Left(_) => {
                         // Nothing to do.
                     }
-                    &mut Either::Right(ref mut vec) => {
+                    Either::Right(ref mut vec) => {
                         vec.push((e, v.clone()));
                     }
                 }
@@ -1989,14 +1983,12 @@ impl<'a> TransactWatcher for InProgressCacheTransactWatcher<'a> {
     }

     fn done(&mut self, _t: &Entid, schema: &Schema) -> Result<()> {
-        // Oh, I wish we had impl trait. Without it we have a six-line type signature if we
+        // Oh, how I wish we had `impl trait`. Without it we have a six-line type signature if we
         // try to break this out as a helper function.
-        let collected_retractions =
-            mem::replace(&mut self.collected_retractions, Default::default());
-        let collected_assertions = mem::replace(&mut self.collected_assertions, Default::default());
+        let collected_retractions = std::mem::take(&mut self.collected_retractions);
+        let collected_assertions = std::mem::take(&mut self.collected_assertions);
         let mut intermediate_expansion = once(collected_retractions)
             .chain(once(collected_assertions))
             .into_iter()
             .map(move |tree| {
                 tree.into_iter()
                     .filter_map(move |(a, evs)| {
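`std::mem::take` (stable since Rust 1.40, matching the new `MIN_VERSION` in build.rs) is shorthand for `mem::replace(&mut x, Default::default())`: it moves the value out and leaves the default behind. A small demonstration:

```rust
use std::collections::BTreeMap;
use std::mem;

fn main() {
    let mut collected: BTreeMap<&str, i64> = BTreeMap::new();
    collected.insert("a", 1);

    // Equivalent to mem::replace(&mut collected, Default::default()),
    // but says it in one call: move the map out, leave an empty one behind.
    let drained = mem::take(&mut collected);

    assert_eq!(drained.len(), 1);
    assert!(collected.is_empty());
}
```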
@@ -2018,7 +2010,7 @@ impl<'a> TransactWatcher for InProgressCacheTransactWatcher<'a> {
 }

 impl InProgressSQLiteAttributeCache {
-    pub fn transact_watcher<'a>(&'a mut self) -> InProgressCacheTransactWatcher<'a> {
+    pub fn transact_watcher(&mut self) -> InProgressCacheTransactWatcher {
         InProgressCacheTransactWatcher::new(self)
     }
 }
db/src/db.rs: 112 changes
@@ -66,10 +66,9 @@ fn make_connection(
     let page_size = 32768;

     let initial_pragmas = if let Some(encryption_key) = maybe_encryption_key {
-        assert!(
-            cfg!(feature = "sqlcipher"),
-            "This function shouldn't be called with a key unless we have sqlcipher support"
-        );
+        if !cfg!(feature = "sqlcipher") {
+            panic!("This function shouldn't be called with a key unless we have sqlcipher support");
+        }
         // Important: The `cipher_page_size` cannot be changed without breaking
         // the ability to open databases that were written when using a
         // different `cipher_page_size`. Additionally, it (AFAICT) must be a
@@ -147,10 +146,10 @@ pub const CURRENT_VERSION: i32 = 1;

 /// MIN_SQLITE_VERSION should be changed when there's a new minimum version of sqlite required
 /// for the project to work.
-const MIN_SQLITE_VERSION: i32 = 3008000;
+const MIN_SQLITE_VERSION: i32 = 3_008_000;

-const TRUE: &'static bool = &true;
-const FALSE: &'static bool = &false;
+const TRUE: &bool = &true;
+const FALSE: &bool = &false;

 /// Turn an owned bool into a static reference to a bool.
 ///
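Dropping `'static` from these consts follows clippy's redundant-static-lifetimes lint (`const_static_lifetime` in older clippy releases): in `const` and `static` declarations, references already have the `'static` lifetime, so spelling it out is noise. Both lines below name the same type, `&'static bool` and `&'static str` respectively:

```rust
// `'static` is implied on references in const/static items.
const TRUE: &bool = &true;       // same type as `&'static bool`
static GREETING: &str = "hello"; // same type as `&'static str`

fn main() {
    assert!(*TRUE);
    assert_eq!(GREETING.len(), 5);
}
```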
@@ -360,9 +359,10 @@ pub fn create_current_version(conn: &mut rusqlite::Connection) -> Result<DB> {
     // TODO: validate metadata mutations that aren't schema related, like additional partitions.
     if let Some(next_schema) = next_schema {
         if next_schema != db.schema {
-            bail!(DbErrorKind::NotYetImplemented(format!(
-                "Initial bootstrap transaction did not produce expected bootstrap schema"
-            )));
+            bail!(DbErrorKind::NotYetImplemented(
+                "Initial bootstrap transaction did not produce expected bootstrap schema"
+                    .to_string()
+            ));
         }
     }
@ -396,7 +396,7 @@ pub trait TypedSQLValue {
|
|||
value: rusqlite::types::Value,
|
||||
value_type_tag: i32,
|
||||
) -> Result<TypedValue>;
|
||||
fn to_sql_value_pair<'a>(&'a self) -> (ToSqlOutput<'a>, i32);
|
||||
fn to_sql_value_pair(&self) -> (ToSqlOutput, i32);
|
||||
fn from_edn_value(value: &Value) -> Option<TypedValue>;
|
||||
fn to_edn_value_pair(&self) -> (Value, ValueType);
|
||||
}
|
||||
|
@ -446,43 +446,43 @@ impl TypedSQLValue for TypedValue {
|
|||
/// This function is deterministic.
|
||||
fn from_edn_value(value: &Value) -> Option<TypedValue> {
|
||||
match value {
|
||||
&Value::Boolean(x) => Some(TypedValue::Boolean(x)),
|
||||
&Value::Instant(x) => Some(TypedValue::Instant(x)),
|
||||
&Value::Integer(x) => Some(TypedValue::Long(x)),
|
||||
&Value::Uuid(x) => Some(TypedValue::Uuid(x)),
|
||||
&Value::Float(ref x) => Some(TypedValue::Double(x.clone())),
|
||||
&Value::Text(ref x) => Some(x.clone().into()),
|
||||
&Value::Keyword(ref x) => Some(x.clone().into()),
|
||||
Value::Boolean(x) => Some(TypedValue::Boolean(*x)),
|
||||
Value::Instant(x) => Some(TypedValue::Instant(*x)),
|
||||
Value::Integer(x) => Some(TypedValue::Long(*x)),
|
||||
Value::Uuid(x) => Some(TypedValue::Uuid(*x)),
|
||||
Value::Float(ref x) => Some(TypedValue::Double(*x)),
|
||||
Value::Text(ref x) => Some(x.clone().into()),
|
||||
Value::Keyword(ref x) => Some(x.clone().into()),
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
|
||||
/// Return the corresponding SQLite `value` and `value_type_tag` pair.
|
||||
fn to_sql_value_pair<'a>(&'a self) -> (ToSqlOutput<'a>, i32) {
|
||||
fn to_sql_value_pair(&self) -> (ToSqlOutput, i32) {
|
||||
match self {
|
||||
&TypedValue::Ref(x) => (x.into(), 0),
|
||||
&TypedValue::Boolean(x) => ((if x { 1 } else { 0 }).into(), 1),
|
||||
&TypedValue::Instant(x) => (x.to_micros().into(), 4),
|
||||
TypedValue::Ref(x) => ((*x).into(), 0),
|
||||
TypedValue::Boolean(x) => ((if *x { 1 } else { 0 }).into(), 1),
|
||||
TypedValue::Instant(x) => (x.to_micros().into(), 4),
|
||||
// SQLite distinguishes integral from decimal types, allowing long and double to share a tag.
|
||||
&TypedValue::Long(x) => (x.into(), 5),
|
||||
&TypedValue::Double(x) => (x.into_inner().into(), 5),
|
||||
&TypedValue::String(ref x) => (x.as_str().into(), 10),
|
||||
&TypedValue::Uuid(ref u) => (u.as_bytes().to_vec().into(), 11),
|
||||
&TypedValue::Keyword(ref x) => (x.to_string().into(), 13),
|
||||
TypedValue::Long(x) => ((*x).into(), 5),
|
||||
TypedValue::Double(x) => (x.into_inner().into(), 5),
|
||||
TypedValue::String(ref x) => (x.as_str().into(), 10),
|
||||
TypedValue::Uuid(ref u) => (u.as_bytes().to_vec().into(), 11),
|
||||
TypedValue::Keyword(ref x) => (x.to_string().into(), 13),
|
||||
}
|
||||
}
|
||||
|
||||
/// Return the corresponding EDN `value` and `value_type` pair.
|
||||
fn to_edn_value_pair(&self) -> (Value, ValueType) {
|
||||
match self {
|
||||
&TypedValue::Ref(x) => (Value::Integer(x), ValueType::Ref),
|
||||
&TypedValue::Boolean(x) => (Value::Boolean(x), ValueType::Boolean),
|
||||
&TypedValue::Instant(x) => (Value::Instant(x), ValueType::Instant),
|
||||
&TypedValue::Long(x) => (Value::Integer(x), ValueType::Long),
|
||||
&TypedValue::Double(x) => (Value::Float(x), ValueType::Double),
|
||||
&TypedValue::String(ref x) => (Value::Text(x.as_ref().clone()), ValueType::String),
|
||||
&TypedValue::Uuid(ref u) => (Value::Uuid(u.clone()), ValueType::Uuid),
|
||||
&TypedValue::Keyword(ref x) => (Value::Keyword(x.as_ref().clone()), ValueType::Keyword),
|
||||
TypedValue::Ref(x) => (Value::Integer(*x), ValueType::Ref),
|
||||
TypedValue::Boolean(x) => (Value::Boolean(*x), ValueType::Boolean),
|
||||
TypedValue::Instant(x) => (Value::Instant(*x), ValueType::Instant),
|
||||
TypedValue::Long(x) => (Value::Integer(*x), ValueType::Long),
|
||||
TypedValue::Double(x) => (Value::Float(*x), ValueType::Double),
|
||||
TypedValue::String(ref x) => (Value::Text(x.as_ref().clone()), ValueType::String),
|
||||
TypedValue::Uuid(ref u) => (Value::Uuid(*u), ValueType::Uuid),
|
||||
TypedValue::Keyword(ref x) => (Value::Keyword(x.as_ref().clone()), ValueType::Keyword),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
@ -510,7 +510,7 @@ pub fn read_partition_map(conn: &rusqlite::Connection) -> Result<PartitionMap> {
|
|||
// First part of the union sprinkles 'allow_excision' into the 'parts' view.
|
||||
// Second part of the union takes care of partitions which are known
|
||||
// but don't have any transactions.
|
||||
let mut stmt: rusqlite::Statement = conn.prepare(
|
||||
conn.prepare(
|
||||
"
|
||||
SELECT
|
||||
known_parts.part,
|
||||
|
@ -536,16 +536,14 @@ pub fn read_partition_map(conn: &rusqlite::Connection) -> Result<PartitionMap> {
|
|||
known_parts
|
||||
WHERE
|
||||
part NOT IN (SELECT part FROM parts)",
|
||||
)?;
|
||||
let m = stmt
|
||||
.query_and_then(rusqlite::params![], |row| -> Result<(String, Partition)> {
|
||||
Ok((
|
||||
row.get(0)?,
|
||||
Partition::new(row.get(1)?, row.get(2)?, row.get(3)?, row.get(4)?),
|
||||
))
|
||||
})?
|
||||
.collect();
|
||||
m
|
||||
)?
|
||||
.query_and_then(rusqlite::params![], |row| -> Result<(String, Partition)> {
|
||||
Ok((
|
||||
row.get(0)?,
|
||||
Partition::new(row.get(1)?, row.get(2)?, row.get(3)?, row.get(4)?),
|
||||
))
|
||||
})?
|
||||
.collect()
|
||||
}
|
||||
|
||||
/// Read the ident map materialized view from the given SQL store.
|
||||
|
@ -767,7 +765,7 @@ impl MentatStoring for rusqlite::Connection {
|
|||
//
|
||||
// TODO: `collect` into a HashSet so that any (a, v) is resolved at most once.
|
||||
let max_vars = self.limit(Limit::SQLITE_LIMIT_VARIABLE_NUMBER) as usize;
|
||||
let chunks: itertools::IntoChunks<_> = avs.into_iter().enumerate().chunks(max_vars / 4);
|
||||
let chunks: itertools::IntoChunks<_> = avs.iter().enumerate().chunks(max_vars / 4);
|
||||
|
||||
// We'd like to `flat_map` here, but it's not obvious how to `flat_map` across `Result`.
|
||||
// Alternatively, this is a `fold`, and it might be wise to express it as such.
|
||||
|
@ -900,9 +898,8 @@ impl MentatStoring for rusqlite::Connection {
|
|||
let bindings_per_statement = 6;
|
||||
|
||||
let max_vars = self.limit(Limit::SQLITE_LIMIT_VARIABLE_NUMBER) as usize;
|
||||
let chunks: itertools::IntoChunks<_> = entities
|
||||
.into_iter()
|
||||
.chunks(max_vars / bindings_per_statement);
|
||||
let chunks: itertools::IntoChunks<_> =
|
||||
entities.iter().chunks(max_vars / bindings_per_statement);
|
||||
|
||||
// We'd like to flat_map here, but it's not obvious how to flat_map across Result.
|
||||
let results: Result<Vec<()>> = chunks.into_iter().map(|chunk| -> Result<()> {
|
||||
|
@ -973,9 +970,8 @@ impl MentatStoring for rusqlite::Connection {
|
|||
|
||||
let mut outer_searchid = 2000;
|
||||
|
||||
let chunks: itertools::IntoChunks<_> = entities
|
||||
.into_iter()
|
||||
.chunks(max_vars / bindings_per_statement);
|
||||
let chunks: itertools::IntoChunks<_> =
|
||||
entities.iter().chunks(max_vars / bindings_per_statement);
|
||||
|
||||
// From string to (searchid, value_type_tag).
|
||||
let mut seen: HashMap<ValueRc<String>, (i64, i32)> = HashMap::with_capacity(entities.len());
|
||||
|
@ -996,7 +992,7 @@ impl MentatStoring for rusqlite::Connection {
|
|||
u8 /* flags0 */,
|
||||
i64 /* searchid */)>> = chunk.map(|&(e, a, ref attribute, ref typed_value, added)| {
|
||||
match typed_value {
|
||||
&TypedValue::String(ref rc) => {
|
||||
TypedValue::String(ref rc) => {
|
||||
datom_count += 1;
|
||||
let entry = seen.entry(rc.clone());
|
||||
match entry {
|
||||
|
@ -1186,7 +1182,10 @@ pub fn update_metadata(
|
|||
// TODO: use concat! to avoid creating String instances.
|
||||
if !metadata_report.idents_altered.is_empty() {
|
||||
// Idents is the materialized view of the [entid :db/ident ident] slice of datoms.
|
||||
conn.execute(format!("DELETE FROM idents").as_str(), rusqlite::params![])?;
|
||||
conn.execute(
|
||||
"DELETE FROM idents".to_string().as_str(),
|
||||
rusqlite::params![],
|
||||
)?;
|
||||
conn.execute(
|
||||
format!(
|
||||
"INSERT INTO idents SELECT e, a, v, value_type_tag FROM datoms WHERE a IN {}",
|
||||
|
@ -1208,7 +1207,10 @@ pub fn update_metadata(
|
|||
|| !metadata_report.attributes_altered.is_empty()
|
||||
|| !metadata_report.idents_altered.is_empty()
|
||||
{
|
||||
conn.execute(format!("DELETE FROM schema").as_str(), rusqlite::params![])?;
|
||||
conn.execute(
|
||||
"DELETE FROM schema".to_string().as_str(),
|
||||
rusqlite::params![],
|
||||
)?;
|
||||
// NB: we're using :db/valueType as a placeholder for the entire schema-defining set.
|
||||
let s = format!(
|
||||
r#"
|
||||
|
|
|
@@ -117,7 +117,7 @@ impl Datom {
     pub fn to_edn(&self) -> edn::Value {
         let f = |entid: &EntidOrIdent| -> edn::Value {
             match *entid {
-                EntidOrIdent::Entid(ref y) => edn::Value::Integer(y.clone()),
+                EntidOrIdent::Entid(ref y) => edn::Value::Integer(*y),
                 EntidOrIdent::Ident(ref y) => edn::Value::Keyword(y.clone()),
             }
         };
@@ -134,13 +134,13 @@ impl Datom {

 impl Datoms {
     pub fn to_edn(&self) -> edn::Value {
-        edn::Value::Vector((&self.0).into_iter().map(|x| x.to_edn()).collect())
+        edn::Value::Vector((&self.0).iter().map(|x| x.to_edn()).collect())
     }
 }

 impl Transactions {
     pub fn to_edn(&self) -> edn::Value {
-        edn::Value::Vector((&self.0).into_iter().map(|x| x.to_edn()).collect())
+        edn::Value::Vector((&self.0).iter().map(|x| x.to_edn()).collect())
     }
 }

@@ -148,7 +148,7 @@ impl FulltextValues {
     pub fn to_edn(&self) -> edn::Value {
         edn::Value::Vector(
             (&self.0)
-                .into_iter()
+                .iter()
                 .map(|&(x, ref y)| {
                     edn::Value::Vector(vec![edn::Value::Integer(x), edn::Value::Text(y.clone())])
                 })
@@ -238,7 +238,7 @@ pub fn datoms_after<S: Borrow<Schema>>(
                 e: EntidOrIdent::Entid(e),
                 a: to_entid(borrowed_schema, a),
                 v: value,
-                tx: tx,
+                tx,
                 added: None,
             }))
         })?
@@ -286,7 +286,7 @@ pub fn transactions_after<S: Borrow<Schema>>(
                 e: EntidOrIdent::Entid(e),
                 a: to_entid(borrowed_schema, a),
                 v: value,
-                tx: tx,
+                tx,
                 added: Some(added),
             })
         })?
@@ -332,12 +332,12 @@ pub fn dump_sql_query(
     let mut stmt: rusqlite::Statement = conn.prepare(sql)?;

     let mut tw = TabWriter::new(Vec::new()).padding(2);
-    write!(&mut tw, "{}\n", sql).unwrap();
+    writeln!(&mut tw, "{}", sql).unwrap();

     for column_name in stmt.column_names() {
         write!(&mut tw, "{}\t", column_name).unwrap();
     }
-    write!(&mut tw, "\n").unwrap();
+    writeln!(&mut tw).unwrap();

     let r: Result<Vec<_>> = stmt
         .query_and_then(params, |row| {
@@ -345,7 +345,7 @@ pub fn dump_sql_query(
                 let value: rusqlite::types::Value = row.get(i)?;
                 write!(&mut tw, "{:?}\t", value).unwrap();
             }
-            write!(&mut tw, "\n").unwrap();
+            writeln!(&mut tw).unwrap();
             Ok(())
         })?
         .collect();
@@ -381,8 +381,9 @@ impl TestConn {
         I: Borrow<str>,
     {
         // Failure to parse the transaction is a coding error, so we unwrap.
-        let entities = edn::parse::entities(transaction.borrow())
-            .expect(format!("to be able to parse {} into entities", transaction.borrow()).as_str());
+        let entities = edn::parse::entities(transaction.borrow()).unwrap_or_else(|_| {
+            panic!("to be able to parse {} into entities", transaction.borrow())
+        });

         let details = {
             // The block scopes the borrow of self.sqlite.
@@ -86,7 +86,7 @@ impl TransactableValue for ValueAndSpan {
             .as_text()
             .cloned()
             .map(TempId::External)
-            .map(|v| v.into())
+            .map(|v| v)
     }
 }

@@ -117,7 +117,7 @@ impl TransactableValue for TypedValue {

     fn as_tempid(&self) -> Option<TempId> {
         match self {
-            &TypedValue::String(ref s) => Some(TempId::External((**s).clone()).into()),
+            TypedValue::String(ref s) => Some(TempId::External((**s).clone())),
             _ => None,
         }
     }
@@ -95,7 +95,7 @@ pub fn to_namespaced_keyword(s: &str) -> Result<symbols::Keyword> {
         _ => None,
     };

-    nsk.ok_or(DbErrorKind::NotYetImplemented(format!("InvalidKeyword: {}", s)).into())
+    nsk.ok_or_else(|| DbErrorKind::NotYetImplemented(format!("InvalidKeyword: {}", s)).into())
 }

 /// Prepare an SQL `VALUES` block, like (?, ?, ?), (?, ?, ?).
@@ -111,7 +111,7 @@ fn update_attribute_map_from_schema_retractions(
     let mut eas = BTreeMap::new();
     for (e, a, v) in retractions.into_iter() {
         if entids::is_a_schema_attribute(a) {
-            eas.entry(e).or_insert(vec![]).push(a);
+            eas.entry(e).or_insert_with(|| vec![]).push(a);
             suspect_retractions.push((e, a, v));
         } else {
             filtered_retractions.push((e, a, v));
@@ -145,7 +145,7 @@ fn update_attribute_map_from_schema_retractions(
                 // Remove attributes corresponding to retracted attribute.
                 attribute_map.remove(&e);
             } else {
-                bail!(DbErrorKind::BadSchemaAssertion(format!("Retracting defining attributes of a schema without retracting its :db/ident is not permitted.")));
+                bail!(DbErrorKind::BadSchemaAssertion("Retracting defining attributes of a schema without retracting its :db/ident is not permitted.".to_string()));
             }
         } else {
             filtered_retractions.push((e, a, v));
@@ -172,7 +172,7 @@ pub fn update_attribute_map_from_entid_triples(
     ) -> AttributeBuilder {
         existing
             .get(&attribute_id)
-            .map(AttributeBuilder::to_modify_attribute)
+            .map(AttributeBuilder::modify_attribute)
             .unwrap_or_else(AttributeBuilder::default)
     }

@@ -337,8 +337,8 @@ pub fn update_attribute_map_from_entid_triples(
     }

     Ok(MetadataReport {
-        attributes_installed: attributes_installed,
-        attributes_altered: attributes_altered,
+        attributes_installed,
+        attributes_altered,
         idents_altered: BTreeMap::default(),
     })
 }
@@ -439,12 +439,12 @@ where
     // component_attributes up-to-date: most of the time we'll rebuild it
     // even though it's not necessary (e.g. a schema attribute that's _not_
     // a component was removed, or a non-component related attribute changed).
-    if report.attributes_did_change() || ident_set.retracted.len() > 0 {
+    if report.attributes_did_change() || !ident_set.retracted.is_empty() {
         schema.update_component_attributes();
     }

     Ok(MetadataReport {
-        idents_altered: idents_altered,
+        idents_altered,
         ..report
     })
 }
@@ -77,7 +77,7 @@ fn validate_attribute_map(entid_map: &EntidMap, attribute_map: &AttributeMap) ->
         entid_map
             .get(entid)
             .map(|ident| ident.to_string())
-            .unwrap_or(entid.to_string())
+            .unwrap_or_else(|| entid.to_string())
     };
     attribute.validate(ident)?;
 }
@@ -108,7 +108,7 @@ impl AttributeBuilder {

     /// Make a new AttributeBuilder from an existing Attribute. This is important to allow
     /// retraction. Only attributes that we allow to change are duplicated here.
-    pub fn to_modify_attribute(attribute: &Attribute) -> Self {
+    pub fn modify_attribute(attribute: &Attribute) -> Self {
         let mut ab = AttributeBuilder::default();
         ab.multival = Some(attribute.multival);
         ab.unique = Some(attribute.unique);
@@ -116,22 +116,22 @@ impl AttributeBuilder {
         ab
     }

-    pub fn value_type<'a>(&'a mut self, value_type: ValueType) -> &'a mut Self {
+    pub fn value_type(&mut self, value_type: ValueType) -> &mut Self {
         self.value_type = Some(value_type);
         self
     }

-    pub fn multival<'a>(&'a mut self, multival: bool) -> &'a mut Self {
+    pub fn multival(&mut self, multival: bool) -> &mut Self {
         self.multival = Some(multival);
         self
     }

-    pub fn non_unique<'a>(&'a mut self) -> &'a mut Self {
+    pub fn non_unique(&mut self) -> &mut Self {
         self.unique = Some(None);
         self
     }

-    pub fn unique<'a>(&'a mut self, unique: attribute::Unique) -> &'a mut Self {
+    pub fn unique(&mut self, unique: attribute::Unique) -> &mut Self {
         if self.helpful && unique == attribute::Unique::Identity {
             self.index = Some(true);
         }
@@ -139,12 +139,12 @@ impl AttributeBuilder {
         self
     }

-    pub fn index<'a>(&'a mut self, index: bool) -> &'a mut Self {
+    pub fn index(&mut self, index: bool) -> &mut Self {
         self.index = Some(index);
         self
     }

-    pub fn fulltext<'a>(&'a mut self, fulltext: bool) -> &'a mut Self {
+    pub fn fulltext(&mut self, fulltext: bool) -> &mut Self {
         self.fulltext = Some(fulltext);
         if self.helpful && fulltext {
             self.index = Some(true);
@@ -152,12 +152,12 @@ impl AttributeBuilder {
         self
     }

-    pub fn component<'a>(&'a mut self, component: bool) -> &'a mut Self {
+    pub fn component(&mut self, component: bool) -> &mut Self {
         self.component = Some(component);
         self
     }

-    pub fn no_history<'a>(&'a mut self, no_history: bool) -> &'a mut Self {
+    pub fn no_history(&mut self, no_history: bool) -> &mut Self {
         self.no_history = Some(no_history);
         self
     }
@@ -197,7 +197,7 @@ impl AttributeBuilder {
             attribute.multival = multival;
         }
         if let Some(ref unique) = self.unique {
-            attribute.unique = unique.clone();
+            attribute.unique = *unique;
         }
         if let Some(index) = self.index {
             attribute.index = index;
@@ -223,14 +223,12 @@ impl AttributeBuilder {

         if let Some(ref unique) = self.unique {
             if *unique != attribute.unique {
-                attribute.unique = unique.clone();
+                attribute.unique = *unique;
                 mutations.push(AttributeAlteration::Unique);
             }
-        } else {
-            if attribute.unique != None {
-                attribute.unique = None;
-                mutations.push(AttributeAlteration::Unique);
-            }
-        }
+        } else if attribute.unique != None {
+            attribute.unique = None;
+            mutations.push(AttributeAlteration::Unique);
+        }

         if let Some(index) = self.index {
@@ -272,17 +270,17 @@ pub trait SchemaBuilding {
 impl SchemaBuilding for Schema {
     fn require_ident(&self, entid: Entid) -> Result<&symbols::Keyword> {
         self.get_ident(entid)
-            .ok_or(DbErrorKind::UnrecognizedEntid(entid).into())
+            .ok_or_else(|| DbErrorKind::UnrecognizedEntid(entid).into())
     }

     fn require_entid(&self, ident: &symbols::Keyword) -> Result<KnownEntid> {
         self.get_entid(&ident)
-            .ok_or(DbErrorKind::UnrecognizedIdent(ident.to_string()).into())
+            .ok_or_else(|| DbErrorKind::UnrecognizedIdent(ident.to_string()).into())
     }

     fn require_attribute_for_entid(&self, entid: Entid) -> Result<&Attribute> {
         self.attribute_for_entid(entid)
-            .ok_or(DbErrorKind::UnrecognizedEntid(entid).into())
+            .ok_or_else(|| DbErrorKind::UnrecognizedEntid(entid).into())
     }

     /// Create a valid `Schema` from the constituent maps.
@@ -290,10 +288,7 @@ impl SchemaBuilding for Schema {
         ident_map: IdentMap,
         attribute_map: AttributeMap,
     ) -> Result<Schema> {
-        let entid_map: EntidMap = ident_map
-            .iter()
-            .map(|(k, v)| (v.clone(), k.clone()))
-            .collect();
+        let entid_map: EntidMap = ident_map.iter().map(|(k, v)| (*v, k.clone())).collect();

         validate_attribute_map(&entid_map, &attribute_map)?;
         Ok(Schema::new(ident_map, entid_map, attribute_map))
@@ -309,10 +304,10 @@ impl SchemaBuilding for Schema {
             .map(|(symbolic_ident, symbolic_attr, value)| {
                 let ident: i64 = *ident_map
                     .get(&symbolic_ident)
-                    .ok_or(DbErrorKind::UnrecognizedIdent(symbolic_ident.to_string()))?;
+                    .ok_or_else(|| DbErrorKind::UnrecognizedIdent(symbolic_ident.to_string()))?;
                 let attr: i64 = *ident_map
                     .get(&symbolic_attr)
-                    .ok_or(DbErrorKind::UnrecognizedIdent(symbolic_attr.to_string()))?;
+                    .ok_or_else(|| DbErrorKind::UnrecognizedIdent(symbolic_attr.to_string()))?;
                 Ok((ident, attr, value))
             })
             .collect();
@@ -58,7 +58,7 @@ fn collect_ordered_txs_to_move(
         None => bail!(DbErrorKind::TimelinesInvalidRange),
     };

-    while let Some(t) = rows.next() {
+    for t in rows {
         let t = t?;
         txs.push(t.0);
         if t.1 != timeline {
@@ -108,12 +108,13 @@ fn reversed_terms_for(
     tx_id: Entid,
 ) -> Result<Vec<TermWithoutTempIds>> {
     let mut stmt = conn.prepare("SELECT e, a, v, value_type_tag, tx, added FROM timelined_transactions WHERE tx = ? AND timeline = ? ORDER BY tx DESC")?;
-    let mut rows = stmt.query_and_then(
+    let rows = stmt.query_and_then(
         &[&tx_id, &::TIMELINE_MAIN],
         |row| -> Result<TermWithoutTempIds> {
-            let op = match row.get(5)? {
-                true => OpType::Retract,
-                false => OpType::Add,
+            let op = if row.get(5)? {
+                OpType::Retract
+            } else {
+                OpType::Add
             };
             Ok(Term::AddOrRetract(
                 op,
@@ -126,7 +127,7 @@ fn reversed_terms_for(

     let mut terms = vec![];

-    while let Some(row) = rows.next() {
+    for row in rows {
         terms.push(row?);
     }
     Ok(terms)
@@ -141,9 +142,9 @@ pub fn move_from_main_timeline(
     new_timeline: Entid,
 ) -> Result<(Option<Schema>, PartitionMap)> {
     if new_timeline == ::TIMELINE_MAIN {
-        bail!(DbErrorKind::NotYetImplemented(format!(
-            "Can't move transactions to main timeline"
-        )));
+        bail!(DbErrorKind::NotYetImplemented(
+            "Can't move transactions to main timeline".to_string()
+        ));
     }

     // We don't currently ensure that moving transactions onto a non-empty timeline
53
db/src/tx.rs
@@ -163,12 +163,12 @@
         tx_id: Entid,
     ) -> Tx<'conn, 'a, W> {
         Tx {
-            store: store,
-            partition_map: partition_map,
+            store,
+            partition_map,
             schema_for_mutation: Cow::Borrowed(schema_for_mutation),
-            schema: schema,
-            watcher: watcher,
-            tx_id: tx_id,
+            schema,
+            watcher,
+            tx_id,
         }
     }

@@ -185,8 +185,8 @@

         // Map [a v]->entid.
         let mut av_pairs: Vec<&AVPair> = vec![];
-        for i in 0..temp_id_avs.len() {
-            av_pairs.push(&temp_id_avs[i].1);
+        for temp_id_av in temp_id_avs {
+            av_pairs.push(&temp_id_av.1);
         }

         // Lookup in the store.
@@ -208,14 +208,14 @@
                 av_map.get(&av_pair)
             );
             if let Some(entid) = av_map.get(&av_pair).cloned().map(KnownEntid) {
-                tempids.insert(tempid.clone(), entid).map(|previous| {
+                if let Some(previous) = tempids.insert(tempid.clone(), entid) {
                     if entid != previous {
                         conflicting_upserts
                             .entry((**tempid).clone())
                             .or_insert_with(|| once(previous).collect::<BTreeSet<_>>())
                             .insert(entid);
                     }
-                });
+                }
             }
         }

@@ -340,7 +340,7 @@
             entmod::EntityPlace::TxFunction(ref tx_function) => {
                 match tx_function.op.0.as_str() {
                     "transaction-tx" => Ok(Either::Left(self.tx_id)),
-                    unknown @ _ => bail!(DbErrorKind::NotYetImplemented(format!(
+                    unknown => bail!(DbErrorKind::NotYetImplemented(format!(
                         "Unknown transaction function {}",
                         unknown
                     ))),
@@ -372,7 +372,7 @@
     ) -> Result<KnownEntidOr<LookupRefOrTempId>> {
         match backward_a.unreversed() {
             None => {
-                bail!(DbErrorKind::NotYetImplemented(format!("Cannot explode map notation value in :attr/_reversed notation for forward attribute")));
+                bail!(DbErrorKind::NotYetImplemented("Cannot explode map notation value in :attr/_reversed notation for forward attribute".to_string()));
             }
             Some(forward_a) => {
                 let forward_a = self.entity_a_into_term_a(forward_a)?;
@@ -412,7 +412,7 @@
             entmod::ValuePlace::TxFunction(ref tx_function) => {
                 match tx_function.op.0.as_str() {
                     "transaction-tx" => Ok(Either::Left(KnownEntid(self.tx_id.0))),
-                    unknown @ _ => bail!(DbErrorKind::NotYetImplemented(format!("Unknown transaction function {}", unknown))),
+                    unknown => bail!(DbErrorKind::NotYetImplemented(format!("Unknown transaction function {}", unknown))),
                 }
             },

@@ -456,7 +456,7 @@
                     op: OpType::Add,
                     e: db_id.clone(),
                     a: AttributePlace::Entid(a),
-                    v: v,
+                    v,
                 });
             }
         }
@@ -519,7 +519,7 @@
             entmod::ValuePlace::TxFunction(ref tx_function) => {
                 let typed_value = match tx_function.op.0.as_str() {
                     "transaction-tx" => TypedValue::Ref(self.tx_id),
-                    unknown @ _ => bail!(DbErrorKind::NotYetImplemented(format!(
+                    unknown => bail!(DbErrorKind::NotYetImplemented(format!(
                         "Unknown transaction function {}",
                         unknown
                     ))),
@@ -546,7 +546,7 @@

                 for vv in vs {
                     deque.push_front(Entity::AddOrRetract {
-                        op: op.clone(),
+                        op,
                         e: e.clone(),
                         a: AttributePlace::Entid(entmod::EntidOrIdent::Entid(a)),
                         v: vv,
@@ -667,8 +667,8 @@
             |term: TermWithTempIdsAndLookupRefs| -> Result<TermWithTempIds> {
                 match term {
                     Term::AddOrRetract(op, e, a, v) => {
-                        let e = replace_lookup_ref(&lookup_ref_map, e, |x| KnownEntid(x))?;
-                        let v = replace_lookup_ref(&lookup_ref_map, v, |x| TypedValue::Ref(x))?;
+                        let e = replace_lookup_ref(&lookup_ref_map, e, KnownEntid)?;
+                        let v = replace_lookup_ref(&lookup_ref_map, v, TypedValue::Ref)?;
                         Ok(Term::AddOrRetract(op, e, a, v))
                     }
                 }
@@ -757,14 +757,14 @@
         for (tempid, entid) in temp_id_map {
             // Since `UpsertEV` instances always transition to `UpsertE` instances, it might be
             // that a tempid resolves in two generations, and those resolutions might conflict.
-            tempids.insert((*tempid).clone(), entid).map(|previous| {
+            if let Some(previous) = tempids.insert((*tempid).clone(), entid) {
                 if entid != previous {
                     conflicting_upserts
                         .entry((*tempid).clone())
                         .or_insert_with(|| once(previous).collect::<BTreeSet<_>>())
                         .insert(entid);
                 }
-            });
+            }
         }

         if !conflicting_upserts.is_empty() {
@@ -891,10 +891,7 @@
             .map(|v| (true, v))
             .chain(ars.retract.into_iter().map(|v| (false, v)))
         {
-            let op = match added {
-                true => OpType::Add,
-                false => OpType::Retract,
-            };
+            let op = if added { OpType::Add } else { OpType::Retract };
             self.watcher.datom(op, e, a, &v);
             queue.push((e, a, attribute, v, added));
         }
@@ -967,7 +964,7 @@
         Ok(TxReport {
             tx_id: self.tx_id,
             tx_instant,
-            tempids: tempids,
+            tempids,
         })
     }
 }
@@ -1093,9 +1090,9 @@

         let a_and_r = trie
             .entry((a, attribute))
-            .or_insert(BTreeMap::default())
+            .or_insert_with(BTreeMap::default)
             .entry(e)
-            .or_insert(AddAndRetract::default());
+            .or_insert_with(AddAndRetract::default);

         match op {
             OpType::Add => a_and_r.add.insert(v),
@@ -1136,9 +1133,9 @@ fn get_or_insert_tx_instant<'schema>(
         entids::DB_TX_INSTANT,
         schema.require_attribute_for_entid(entids::DB_TX_INSTANT)?,
     ))
-    .or_insert(BTreeMap::default())
+    .or_insert_with(BTreeMap::default)
     .entry(tx_id)
-    .or_insert(AddAndRetract::default());
+    .or_insert_with(AddAndRetract::default);
     if !ars.retract.is_empty() {
         // Cannot retract :db/txInstant!
     }
@@ -82,17 +82,18 @@ impl TxCommand {

 impl Command for TxCommand {
     fn execute(&mut self) {
-        self.observers.upgrade().map(|observers| {
+        if let Some(observers) = self.observers.upgrade() {
             for (key, observer) in observers.iter() {
                 let applicable_reports = observer.applicable_reports(&self.reports);
                 if !applicable_reports.is_empty() {
                     observer.notify(&key, applicable_reports);
                 }
             }
-        });
+        }
     }
 }

+#[derive(Default)]
 pub struct TxObservationService {
     observers: Arc<IndexMap<String, Arc<TxObserver>>>,
     executor: Option<Sender<Box<dyn Command + Send>>>,
@@ -107,7 +108,7 @@ impl TxObservationService {
     }

     // For testing purposes
-    pub fn is_registered(&self, key: &String) -> bool {
+    pub fn is_registered(&self, key: &str) -> bool {
         self.observers.contains_key(key)
     }

@@ -115,7 +116,7 @@ impl TxObservationService {
         Arc::make_mut(&mut self.observers).insert(key, observer);
     }

-    pub fn deregister(&mut self, key: &String) {
+    pub fn deregister(&mut self, key: &str) {
         Arc::make_mut(&mut self.observers).remove(key);
     }

@@ -154,6 +155,7 @@ impl Drop for TxObservationService {
     }
 }

+#[derive(Default)]
 pub struct InProgressObserverTransactWatcher {
     collected_attributes: AttributeSet,
     pub txes: IndexMap<Entid, AttributeSet>,
@@ -174,8 +176,7 @@ impl TransactWatcher for InProgressObserverTransactWatcher {
     }

     fn done(&mut self, t: &Entid, _schema: &Schema) -> Result<()> {
-        let collected_attributes =
-            ::std::mem::replace(&mut self.collected_attributes, Default::default());
+        let collected_attributes = ::std::mem::take(&mut self.collected_attributes);
         self.txes.insert(*t, collected_attributes);
         Ok(())
     }
@@ -127,8 +127,8 @@ pub struct DB {
 impl DB {
     pub fn new(partition_map: PartitionMap, schema: Schema) -> DB {
         DB {
-            partition_map: partition_map,
-            schema: schema,
+            partition_map,
+            schema,
         }
     }
 }
@@ -227,7 +227,7 @@ impl Generation {
     }

     // Collect id->[a v] pairs that might upsert at this evolutionary step.
-    pub(crate) fn temp_id_avs<'a>(&'a self) -> Vec<(TempIdHandle, AVPair)> {
+    pub(crate) fn temp_id_avs(&self) -> Vec<(TempIdHandle, AVPair)> {
         let mut temp_id_avs: Vec<(TempIdHandle, AVPair)> = vec![];
         // TODO: map/collect.
         for &UpsertE(ref t, ref a, ref v) in &self.upserts_e {
@@ -269,32 +269,32 @@ impl Generation {

         for term in self.allocations.iter() {
             match term {
-                &Term::AddOrRetract(OpType::Add, Right(ref t1), a, Right(ref t2)) => {
+                Term::AddOrRetract(OpType::Add, Right(ref t1), a, Right(ref t2)) => {
                     temp_ids.insert(t1.clone());
                     temp_ids.insert(t2.clone());
-                    let attribute: &Attribute = schema.require_attribute_for_entid(a)?;
+                    let attribute: &Attribute = schema.require_attribute_for_entid(*a)?;
                     if attribute.unique == Some(attribute::Unique::Identity) {
                         tempid_avs
-                            .entry((a, Right(t2.clone())))
-                            .or_insert(vec![])
+                            .entry((*a, Right(t2.clone())))
+                            .or_insert_with(|| vec![])
                             .push(t1.clone());
                     }
                 }
-                &Term::AddOrRetract(OpType::Add, Right(ref t), a, ref x @ Left(_)) => {
+                Term::AddOrRetract(OpType::Add, Right(ref t), a, ref x @ Left(_)) => {
                     temp_ids.insert(t.clone());
-                    let attribute: &Attribute = schema.require_attribute_for_entid(a)?;
+                    let attribute: &Attribute = schema.require_attribute_for_entid(*a)?;
                     if attribute.unique == Some(attribute::Unique::Identity) {
                         tempid_avs
-                            .entry((a, x.clone()))
-                            .or_insert(vec![])
+                            .entry((*a, x.clone()))
+                            .or_insert_with(|| vec![])
                             .push(t.clone());
                     }
                 }
-                &Term::AddOrRetract(OpType::Add, Left(_), _, Right(ref t)) => {
+                Term::AddOrRetract(OpType::Add, Left(_), _, Right(ref t)) => {
                     temp_ids.insert(t.clone());
                 }
-                &Term::AddOrRetract(OpType::Add, Left(_), _, Left(_)) => unreachable!(),
-                &Term::AddOrRetract(OpType::Retract, _, _, _) => {
+                Term::AddOrRetract(OpType::Add, Left(_), _, Left(_)) => unreachable!(),
+                Term::AddOrRetract(OpType::Retract, _, _, _) => {
                     // [:db/retract ...] entities never allocate entids; they have to resolve due to
                     // other upserts (or they fail the transaction).
                 }
@@ -319,13 +319,11 @@ impl Generation {
         );

         for vs in tempid_avs.values() {
-            vs.first()
-                .and_then(|first| temp_ids.get(first))
-                .map(|&first_index| {
-                    for tempid in vs {
-                        temp_ids.get(tempid).map(|&i| uf.union(first_index, i));
-                    }
-                });
+            if let Some(&first_index) = vs.first().and_then(|first| temp_ids.get(first)) {
+                for tempid in vs {
+                    temp_ids.get(tempid).map(|&i| uf.union(first_index, i));
+                }
+            }
         }

         debug!("union-find aggregation {:?}", uf.clone().into_labeling());
|
||||
|
|
177
docs/tutorial.md
Normal file
|
@ -0,0 +1,177 @@
|
|||
# Introduction
|
||||
|
||||
Mentat is a transactional, relational storage system built on top of SQLite. The abstractions it offers allow you to easily tackle some things that are tricky in other storage systems:
|
||||
|
||||
- Have multiple components share storage and collaborate.
|
||||
- Evolve schema.
|
||||
- Track change over time.
|
||||
- Synchronize data correctly.
|
||||
- Store data with rich, checked types.
|
||||
|
||||
Mentat offers a programmatic Rust API for managing stores, retrieving data, and _transacting_ new data. It offers a Datalog-based query engine, with queries expressed in EDN, a rich textual data format similar to JSON. And it offers an EDN data format for transacting new data.
|
||||
|
||||
This tutorial covers all of these APIs, along with defining vocabulary.
|
||||
|
||||
We'll begin by introducing some concepts, and then we'll walk through some examples.
|
||||
|
||||
|
||||
## What does Mentat store?
|
||||
|
||||
Depending on your perspective, Mentat looks like a relational store, a graph store, or a tuple store.
|
||||
|
||||
Mentat stores relationships between _entities_ and other entities or _values_. An entity is related to other things by an _attribute_.
|
||||
|
||||
All entities have an _entity ID_ (abbreviated to _entid_).
|
||||
|
||||
Some entities additionally have an identifier called an _ident_, which is a keyword. That looks something like `:bookmark/title`.
|
||||
|
||||
A value is a primitive piece of data. Mentat supports the following:
|
||||
|
||||
* Strings
|
||||
* Long integers
|
||||
* Double-precision floating point numbers
|
||||
* Millisecond-precision timestamps
|
||||
* UUIDs
|
||||
* Booleans
|
||||
* Keywords (a special kind of string that we use for idents).
|
||||
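As a hedged sketch of how those value types look as EDN literals (the `#inst`/`#uuid` tagged-literal syntax is standard EDN; the specific values here are illustrative):

```edn
["a string"                                   ; string
 42                                           ; long integer
 3.14                                         ; double-precision float
 #inst "2018-01-25T20:07:04.408Z"             ; millisecond-precision timestamp
 #uuid "550e8400-e29b-41d4-a716-446655440000" ; UUID
 true                                         ; boolean
 :bookmark/title]                             ; keyword
```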
|
||||
There are two special kinds of entities: _attributes_ and _transactions_.
|
||||
|
||||
Attributes are themselves entities with a particular set of properties that define their meaning. They have identifiers, so you can refer to them easily. They have a _value type_, which is the type of value Mentat expects to be on the right-hand side of the relationship. They also have a _cardinality_ (whether one or many values can exist for a particular entity), a _uniqueness_ constraint, a documentation string, and some indexing options.
|
||||
|
||||
An attribute looks something like this:
|
||||
|
||||
```edn
|
||||
{:db/ident :bookmark/title
|
||||
:db/cardinality :db.cardinality/one
|
||||
:db/valueType :db.type/string
|
||||
:db/fulltext true
|
||||
:db/doc "The title of a bookmark."}
|
||||
```
|
||||
|
||||
Transactions are special entities that can be described however you wish. By default they track the timestamp at which they were written.
|
||||
|
||||
The relationship between an entity, an attribute, and a value, occurring in a _transaction_ (which is just another kind of entity!) — a tuple of five values — is called a _datom_.
|
||||
|
||||
A single datom might look something like this:
|
||||
|
||||
```
|
||||
[:db/add 65536 :bookmark/title "mozilla.org" 268435456]
|
||||
    ^      ^          ^              ^           ^
|
||||
    |      |          |              |           \ The transaction ID.
|
||||
    |      |          |              \ The value, a string.
|
||||
    |      |          \ The attribute.
|
||||
    |      \ The entity.
|
||||
    \ Add or retract.
|
||||
```
|
||||
|
||||
which is equivalent to saying "in transaction 268435456 we assert that entity 65536 is a bookmark with the title 'mozilla.org'".
|
||||
|
||||
When we transact that — which means to add it as a fact to the store — Mentat also describes the transaction itself on our behalf:
|
||||
|
||||
```edn
|
||||
[:db/add 268435456 :db/txInstant "2018-01-25 20:07:04.408358 UTC" 268435456]
|
||||
```
|
||||
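Retractions use the same five-part shape with `:db/retract`; a hedged sketch, reusing the illustrative entid and value from above:

```edn
[:db/retract 65536 :bookmark/title "mozilla.org"]
```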
|
||||
# A simple app
|
||||
|
||||
Let's get started with some Rust code.
|
||||
|
||||
First, the imports we'll need. The comments here briefly explain what each thing is.
|
||||
|
||||
```rust
|
||||
// So you can define keywords with neater syntax.
|
||||
#[macro_use(kw)]
|
||||
extern crate mentat;
|
||||
|
||||
use mentat::{
|
||||
Store, // A single database connection and in-memory metadata.
|
||||
};
|
||||
|
||||
use mentat::vocabulary::attribute; // Properties of attributes.
|
||||
```
|
||||
|
||||
## Defining a simple vocabulary
|
||||
|
||||
All data in Mentat — even the terms we used above, like `:db/cardinality` — are defined in the store itself. So that's where we start. In Rust, we create a _vocabulary definition_ and ask the store to ensure that it exists.
|
||||
|
||||
```rust
|
||||
|
||||
fn set_up(mut store: Store) -> Result<()> {
|
||||
// Start a write transaction.
|
||||
let mut in_progress = store.begin_transaction()?;
|
||||
|
||||
// Make sure the core vocabulary exists. This is good practice if a user,
|
||||
// an external tool, or another component might have touched the file
|
||||
// since you last wrote it.
|
||||
in_progress.verify_core_schema()?;
|
||||
|
||||
// Make sure our vocabulary is installed, and install if necessary.
|
||||
// This defines some attributes that we can use to describe people.
|
||||
in_progress.ensure_vocabulary(&Definition {
|
||||
name: kw!(:example/people),
|
||||
version: 1,
|
||||
attributes: vec![
|
||||
(kw!(:person/name),
|
||||
vocabulary::AttributeBuilder::default()
|
||||
.value_type(ValueType::String)
|
||||
.multival(true)
|
||||
.build()),
|
||||
(kw!(:person/age),
|
||||
vocabulary::AttributeBuilder::default()
|
||||
.value_type(ValueType::Long)
|
||||
.multival(false)
|
||||
.build()),
|
||||
(kw!(:person/email),
|
||||
vocabulary::AttributeBuilder::default()
|
||||
.value_type(ValueType::String)
|
||||
.multival(true)
|
||||
.unique(attribute::Unique::Identity)
|
||||
.build()),
|
||||
],
|
||||
})?;
|
||||
|
||||
in_progress.commit()?;
|
||||
Ok(())
|
||||
}
|
||||
```
|
||||
|
||||
We open a store and configure its vocabulary like this:
|
||||
|
||||
```rust
|
||||
let path = "/path/to/file.db";
|
||||
let store = Store::open(path)?;
|
||||
set_up(store)?;
|
||||
```
|
||||
|
||||
If this code returns successfully, we're good to go.
|
||||
|
||||
## Transactions
|
||||
|
||||
You'll see in our `set_up` function that we begin and end a transaction, which we call `in_progress`. A read-only transaction is begun via `begin_read`. The resulting objects, `InProgress` and `InProgressRead`, support various kinds of read and write operations. Transactions are automatically rolled back when dropped, so remember to call `commit`!
|
||||
|
||||
## Adding some data
|
||||
|
||||
There are two ways to add data to Mentat: programmatically or textually.
|
||||
|
||||
The textual form accepts EDN, a simple relative of JSON that supports richer types and more flexible syntax. You saw this in the introduction. Here's an example:
|
||||
|
||||
```rust
|
||||
in_progress.transact(r#"[
|
||||
{:person/name "Alice"
|
||||
:person/age 32
|
||||
:person/email "alice@example.org"}
|
||||
]"#)?;
|
||||
```
|
||||
|
||||
You can implicitly _upsert_ data when you have a unique attribute to use:
|
||||
|
||||
```rust
|
||||
// Alice's age is now 33. Note that we don't need to find out an entid,
|
||||
// nor explicitly INSERT OR REPLACE or UPDATE OR INSERT or similar.
|
||||
in_progress.transact(r#"[
|
||||
{:person/age 33
|
||||
:person/email "alice@example.org"}
|
||||
]"#)?;
|
||||
```
|
||||
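The introduction mentions Mentat's Datalog query engine. As a hedged sketch of what reading this data back looks like, here is an EDN Datalog query over the vocabulary defined above:

```edn
[:find ?name ?age
 :where
 [?person :person/name ?name]
 [?person :person/age ?age]]
```

In Rust this would be passed as a string to a query method such as `in_progress.q_once(...)`; the exact call shape here is an assumption, not something shown in this tutorial.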
|
|
@ -49,8 +49,8 @@ impl TempId {
|
|||
impl fmt::Display for TempId {
|
||||
fn fmt(&self, f: &mut fmt::Formatter) -> Result<(), fmt::Error> {
|
||||
match self {
|
||||
&TempId::External(ref s) => write!(f, "{}", s),
|
||||
&TempId::Internal(x) => write!(f, "<tempid {}>", x),
|
||||
TempId::External(ref s) => write!(f, "{}", s),
|
||||
TempId::Internal(x) => write!(f, "<tempid {}>", x),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
@ -76,8 +76,8 @@ impl From<Keyword> for EntidOrIdent {
|
|||
impl EntidOrIdent {
|
||||
pub fn unreversed(&self) -> Option<EntidOrIdent> {
|
||||
match self {
|
||||
&EntidOrIdent::Entid(_) => None,
|
||||
&EntidOrIdent::Ident(ref a) => a.unreversed().map(EntidOrIdent::Ident),
|
||||
EntidOrIdent::Entid(_) => None,
|
||||
EntidOrIdent::Ident(ref a) => a.unreversed().map(EntidOrIdent::Ident),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
|
|
@ -12,8 +12,8 @@ extern crate chrono;
|
|||
extern crate itertools;
|
||||
extern crate num;
|
||||
extern crate ordered_float;
|
||||
extern crate pretty;
|
||||
extern crate peg;
|
||||
extern crate pretty;
|
||||
extern crate uuid;
|
||||
|
||||
#[cfg(feature = "serde_support")]
|
||||
|
@ -50,13 +50,11 @@ pub use types::{
|
|||
|
||||
pub use symbols::{Keyword, NamespacedSymbol, PlainSymbol};
|
||||
|
||||
use std::collections::{BTreeSet, BTreeMap, LinkedList};
|
||||
use std::collections::{BTreeMap, BTreeSet, LinkedList};
|
||||
use std::f64::{INFINITY, NAN, NEG_INFINITY};
|
||||
use std::iter::FromIterator;
|
||||
use std::f64::{NAN, INFINITY, NEG_INFINITY};
|
||||
|
||||
use chrono::{
|
||||
TimeZone,
|
||||
};
|
||||
use chrono::TimeZone;
|
||||
|
||||
use entities::*;
|
||||
use query::FromValue;
|
||||
|
@ -126,7 +124,7 @@ peg::parser!(pub grammar parse() for str {
|
|||
// result = r#""foo\\bar""#
|
||||
// For the typical case, string_normal_chars will match multiple, leading to a single-element vec.
|
||||
pub rule raw_text() -> String = "\"" t:((string_special_char() / string_normal_chars())*) "\""
|
||||
{ t.join(&"").to_string() }
|
||||
{ t.join(&"") }
|
||||
|
||||
pub rule text() -> SpannedValue
|
||||
= v:raw_text() { SpannedValue::Text(v) }
|
||||
|
@ -150,8 +148,8 @@ peg::parser!(pub grammar parse() for str {
|
|||
rule inst_micros() -> DateTime<Utc> =
|
||||
"#instmicros" whitespace()+ d:$( digit()+ ) {
|
||||
let micros = d.parse::<i64>().unwrap();
|
||||
let seconds: i64 = micros / 1000000;
|
||||
let nanos: u32 = ((micros % 1000000).abs() as u32) * 1000;
|
||||
let seconds: i64 = micros / 1_000_000;
|
||||
let nanos: u32 = ((micros % 1_000_000).abs() as u32) * 1000;
|
||||
Utc.timestamp(seconds, nanos)
|
||||
}
|
||||
|
||||
|
@ -159,7 +157,7 @@ peg::parser!(pub grammar parse() for str {
|
|||
"#instmillis" whitespace()+ d:$( digit()+ ) {
|
||||
let millis = d.parse::<i64>().unwrap();
|
||||
let seconds: i64 = millis / 1000;
|
||||
let nanos: u32 = ((millis % 1000).abs() as u32) * 1000000;
|
||||
let nanos: u32 = ((millis % 1000).abs() as u32) * 1_000_000;
|
||||
Utc.timestamp(seconds, nanos)
|
||||
}
|
||||
|
||||
|
@ -351,7 +349,7 @@ peg::parser!(pub grammar parse() for str {
|
|||
= __ "*" __ { query::PullAttributeSpec::Wildcard }
|
||||
/ __ k:raw_forward_namespaced_keyword() __ alias:(":as" __ alias:raw_forward_keyword() __ { alias })? {
|
||||
let attribute = query::PullConcreteAttribute::Ident(::std::rc::Rc::new(k));
|
||||
let alias = alias.map(|alias| ::std::rc::Rc::new(alias));
|
||||
let alias = alias.map(::std::rc::Rc::new);
|
||||
query::PullAttributeSpec::Attribute(
|
||||
query::NamedPullAttribute {
|
||||
attribute,
|
||||
|
@ -525,4 +523,3 @@ peg::parser!(pub grammar parse() for str {
|
|||
/ v:variable() { query::Binding::BindScalar(v) }
|
||||
|
||||
});
|
||||
|
||||
|
|
|
@ -85,7 +85,7 @@ impl NamespaceableName {
|
|||
|
||||
NamespaceableName {
|
||||
components: dest,
|
||||
boundary: boundary,
|
||||
boundary,
|
||||
}
|
||||
}
|
||||
|
||||
|
@ -144,7 +144,7 @@ impl NamespaceableName {
|
|||
}
|
||||
|
||||
#[inline]
|
||||
pub fn components<'a>(&'a self) -> (&'a str, &'a str) {
|
||||
pub fn components(&self) -> (&str, &str) {
|
||||
if self.boundary > 0 {
|
||||
(
|
||||
&self.components[0..self.boundary],
|
||||
|
@ -219,11 +219,11 @@ impl<'de> Deserialize<'de> for NamespaceableName {
|
|||
D: Deserializer<'de>,
|
||||
{
|
||||
let separated = SerializedNamespaceableName::deserialize(deserializer)?;
|
||||
if separated.name.len() == 0 {
|
||||
if separated.name.is_empty() {
|
||||
return Err(de::Error::custom("Empty name in keyword or symbol"));
|
||||
}
|
||||
if let Some(ns) = separated.namespace {
|
||||
if ns.len() == 0 {
|
||||
if ns.is_empty() {
|
||||
Err(de::Error::custom(
|
||||
"Empty but present namespace in keyword or symbol",
|
||||
))
|
||||
|
|
167
edn/src/query.rs
|
@ -51,10 +51,6 @@ impl Variable {
|
|||
self.0.as_ref().0.as_str()
|
||||
}
|
||||
|
||||
pub fn to_string(&self) -> String {
|
||||
self.0.as_ref().0.clone()
|
||||
}
|
||||
|
||||
pub fn name(&self) -> PlainSymbol {
|
||||
self.0.as_ref().clone()
|
||||
}
|
||||
|
@ -87,7 +83,7 @@ impl FromValue<Variable> for Variable {
|
|||
impl Variable {
|
||||
pub fn from_rc(sym: Rc<PlainSymbol>) -> Option<Variable> {
|
||||
if sym.is_var_symbol() {
|
||||
Some(Variable(sym.clone()))
|
||||
Some(Variable(sym))
|
||||
} else {
|
||||
None
|
||||
}
|
||||
|
@ -246,18 +242,18 @@ impl FromValue<FnArg> for FnArg {
|
|||
impl std::fmt::Display for FnArg {
|
||||
fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
|
||||
match self {
|
||||
&FnArg::Variable(ref var) => write!(f, "{}", var),
|
||||
&FnArg::SrcVar(ref var) => {
|
||||
FnArg::Variable(ref var) => write!(f, "{}", var),
|
||||
FnArg::SrcVar(ref var) => {
|
||||
if var == &SrcVar::DefaultSrc {
|
||||
write!(f, "$")
|
||||
} else {
|
||||
write!(f, "{:?}", var)
|
||||
}
|
||||
}
|
||||
&FnArg::EntidOrInteger(entid) => write!(f, "{}", entid),
|
||||
&FnArg::IdentOrKeyword(ref kw) => write!(f, "{}", kw),
|
||||
&FnArg::Constant(ref constant) => write!(f, "{:?}", constant),
|
||||
&FnArg::Vector(ref vec) => write!(f, "{:?}", vec),
|
||||
FnArg::EntidOrInteger(entid) => write!(f, "{}", entid),
|
||||
FnArg::IdentOrKeyword(ref kw) => write!(f, "{}", kw),
|
||||
FnArg::Constant(ref constant) => write!(f, "{:?}", constant),
|
||||
FnArg::Vector(ref vec) => write!(f, "{:?}", vec),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
@ -265,7 +261,7 @@ impl std::fmt::Display for FnArg {
|
|||
impl FnArg {
|
||||
pub fn as_variable(&self) -> Option<&Variable> {
|
||||
match self {
|
||||
&FnArg::Variable(ref v) => Some(v),
|
||||
FnArg::Variable(ref v) => Some(v),
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
|
@ -332,12 +328,10 @@ impl FromValue<PatternNonValuePlace> for PatternNonValuePlace {
|
|||
::SpannedValue::PlainSymbol(ref x) => {
|
||||
if x.0.as_str() == "_" {
|
||||
Some(PatternNonValuePlace::Placeholder)
|
||||
} else if let Some(v) = Variable::from_symbol(x) {
|
||||
Some(PatternNonValuePlace::Variable(v))
|
||||
} else {
|
||||
if let Some(v) = Variable::from_symbol(x) {
|
||||
Some(PatternNonValuePlace::Variable(v))
|
||||
} else {
|
||||
None
|
||||
}
|
||||
None
|
||||
}
|
||||
}
|
||||
::SpannedValue::Keyword(ref x) => Some(x.clone().into()),
|
||||
|
@ -404,9 +398,9 @@ impl FromValue<PatternValuePlace> for PatternValuePlace {
|
|||
{
|
||||
Some(PatternValuePlace::Constant(x.clone().into()))
|
||||
}
|
||||
::SpannedValue::Uuid(ref u) => Some(PatternValuePlace::Constant(
|
||||
NonIntegerConstant::Uuid(u.clone()),
|
||||
)),
|
||||
::SpannedValue::Uuid(ref u) => {
|
||||
Some(PatternValuePlace::Constant(NonIntegerConstant::Uuid(*u)))
|
||||
}
|
||||
|
||||
// These don't appear in queries.
|
||||
::SpannedValue::Nil => None,
|
||||
|
@ -498,15 +492,15 @@ pub enum PullAttributeSpec {
|
|||
impl std::fmt::Display for PullConcreteAttribute {
|
||||
fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
|
||||
match self {
|
||||
&PullConcreteAttribute::Ident(ref k) => write!(f, "{}", k),
|
||||
&PullConcreteAttribute::Entid(i) => write!(f, "{}", i),
|
||||
PullConcreteAttribute::Ident(ref k) => write!(f, "{}", k),
|
||||
PullConcreteAttribute::Entid(i) => write!(f, "{}", i),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl std::fmt::Display for NamedPullAttribute {
|
||||
fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
|
||||
if let &Some(ref alias) = &self.alias {
|
||||
if let Some(ref alias) = self.alias {
|
||||
write!(f, "{} :as {}", self.attribute, alias)
|
||||
} else {
|
||||
write!(f, "{}", self.attribute)
|
||||
|
@ -517,8 +511,8 @@ impl std::fmt::Display for NamedPullAttribute {
|
|||
impl std::fmt::Display for PullAttributeSpec {
|
||||
fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
|
||||
match self {
|
||||
&PullAttributeSpec::Wildcard => write!(f, "*"),
|
||||
&PullAttributeSpec::Attribute(ref attr) => write!(f, "{}", attr),
|
||||
PullAttributeSpec::Wildcard => write!(f, "*"),
|
||||
PullAttributeSpec::Attribute(ref attr) => write!(f, "{}", attr),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
@ -553,10 +547,10 @@ impl Element {
|
|||
/// Returns true if the element must yield only one value.
|
||||
pub fn is_unit(&self) -> bool {
|
||||
match self {
|
||||
&Element::Variable(_) => false,
|
||||
&Element::Pull(_) => false,
|
||||
&Element::Aggregate(_) => true,
|
||||
&Element::Corresponding(_) => true,
|
||||
Element::Variable(_) => false,
|
||||
Element::Pull(_) => false,
|
||||
Element::Aggregate(_) => true,
|
||||
Element::Corresponding(_) => true,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
@ -570,8 +564,8 @@ impl From<Variable> for Element {
|
|||
impl std::fmt::Display for Element {
|
||||
fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
|
||||
match self {
|
||||
&Element::Variable(ref var) => write!(f, "{}", var),
|
||||
&Element::Pull(Pull {
|
||||
Element::Variable(ref var) => write!(f, "{}", var),
|
||||
Element::Pull(Pull {
|
||||
ref var,
|
||||
ref patterns,
|
||||
}) => {
|
||||
|
@ -581,12 +575,12 @@ impl std::fmt::Display for Element {
|
|||
}
|
||||
write!(f, "])")
|
||||
}
|
||||
&Element::Aggregate(ref agg) => match agg.args.len() {
|
||||
Element::Aggregate(ref agg) => match agg.args.len() {
|
||||
0 => write!(f, "({})", agg.func),
|
||||
1 => write!(f, "({} {})", agg.func, agg.args[0]),
|
||||
_ => write!(f, "({} {:?})", agg.func, agg.args),
|
||||
},
|
||||
&Element::Corresponding(ref var) => write!(f, "(the {})", var),
|
||||
Element::Corresponding(ref var) => write!(f, "(the {})", var),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
@ -609,20 +603,15 @@ pub enum Limit {
|
|||
///
|
||||
/// ```rust
|
||||
/// # use edn::query::{Element, FindSpec, Variable};
|
||||
/// let elements = vec![
|
||||
/// Element::Variable(Variable::from_valid_name("?foo")),
|
||||
/// Element::Variable(Variable::from_valid_name("?bar")),
|
||||
/// ];
|
||||
/// let rel = FindSpec::FindRel(elements);
|
||||
///
|
||||
/// # fn main() {
|
||||
///
|
||||
/// let elements = vec![
|
||||
/// Element::Variable(Variable::from_valid_name("?foo")),
|
||||
/// Element::Variable(Variable::from_valid_name("?bar")),
|
||||
/// ];
|
||||
/// let rel = FindSpec::FindRel(elements);
|
||||
///
|
||||
/// if let FindSpec::FindRel(elements) = rel {
|
||||
/// assert_eq!(2, elements.len());
|
||||
/// }
|
||||
///
|
||||
/// # }
|
||||
/// if let FindSpec::FindRel(elements) = rel {
|
||||
/// assert_eq!(2, elements.len());
|
||||
/// }
|
||||
/// ```
|
||||
///
|
||||
#[derive(Clone, Debug, Eq, PartialEq)]
|
||||
|
@ -649,19 +638,19 @@ impl FindSpec {
|
|||
pub fn is_unit_limited(&self) -> bool {
|
||||
use self::FindSpec::*;
|
||||
match self {
|
||||
&FindScalar(..) => true,
|
||||
&FindTuple(..) => true,
|
||||
&FindRel(..) => false,
|
||||
&FindColl(..) => false,
|
||||
FindScalar(..) => true,
|
||||
FindTuple(..) => true,
|
||||
FindRel(..) => false,
|
||||
FindColl(..) => false,
|
||||
}
|
||||
}
|
||||
|
||||
pub fn expected_column_count(&self) -> usize {
|
||||
use self::FindSpec::*;
|
||||
match self {
|
||||
&FindScalar(..) => 1,
|
||||
&FindColl(..) => 1,
|
||||
&FindTuple(ref elems) | &FindRel(ref elems) => elems.len(),
|
||||
FindScalar(..) => 1,
|
||||
FindColl(..) => 1,
|
||||
FindTuple(ref elems) | FindRel(ref elems) => elems.len(),
|
||||
}
|
||||
}
|
||||
|
||||
|
@ -690,10 +679,10 @@ impl FindSpec {
|
|||
pub fn columns<'s>(&'s self) -> Box<dyn Iterator<Item = &Element> + 's> {
|
||||
use self::FindSpec::*;
|
||||
match self {
|
||||
&FindScalar(ref e) => Box::new(std::iter::once(e)),
|
||||
&FindColl(ref e) => Box::new(std::iter::once(e)),
|
||||
&FindTuple(ref v) => Box::new(v.iter()),
|
||||
&FindRel(ref v) => Box::new(v.iter()),
|
||||
FindScalar(ref e) => Box::new(std::iter::once(e)),
|
||||
FindColl(ref e) => Box::new(std::iter::once(e)),
|
||||
FindTuple(ref v) => Box::new(v.iter()),
|
||||
FindRel(ref v) => Box::new(v.iter()),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
@ -716,8 +705,8 @@ impl VariableOrPlaceholder {
|
|||
|
||||
pub fn var(&self) -> Option<&Variable> {
|
||||
match self {
|
||||
&VariableOrPlaceholder::Placeholder => None,
|
||||
&VariableOrPlaceholder::Variable(ref var) => Some(var),
|
||||
VariableOrPlaceholder::Placeholder => None,
|
||||
VariableOrPlaceholder::Variable(ref var) => Some(var),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
@ -771,11 +760,11 @@ impl Binding {
|
|||
/// ```
|
||||
pub fn is_valid(&self) -> bool {
|
||||
match self {
|
||||
&Binding::BindScalar(_) | &Binding::BindColl(_) => true,
|
||||
&Binding::BindRel(ref vars) | &Binding::BindTuple(ref vars) => {
|
||||
Binding::BindScalar(_) | Binding::BindColl(_) => true,
|
||||
Binding::BindRel(ref vars) | Binding::BindTuple(ref vars) => {
|
||||
let mut acc = HashSet::<Variable>::new();
|
||||
for var in vars {
|
||||
if let &VariableOrPlaceholder::Variable(ref var) = var {
|
||||
if let VariableOrPlaceholder::Variable(ref var) = *var {
|
||||
if !acc.insert(var.clone()) {
|
||||
// It's invalid if there was an equal var already present in the set --
|
||||
// i.e., we have a duplicate var.
|
||||
|
@ -832,7 +821,7 @@ impl Pattern {
|
|||
entity: v_e,
|
||||
attribute: k.to_reversed().into(),
|
||||
value: e_v,
|
||||
tx: tx,
|
||||
tx,
|
||||
});
|
||||
} else {
|
||||
return None;
|
||||
|
@ -844,7 +833,7 @@ impl Pattern {
|
|||
entity: e,
|
||||
attribute: a,
|
||||
value: v,
|
||||
tx: tx,
|
||||
tx,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
@ -894,7 +883,7 @@ pub enum UnifyVars {
|
|||
impl WhereClause {
|
||||
pub fn is_pattern(&self) -> bool {
|
||||
match self {
|
||||
&WhereClause::Pattern(_) => true,
|
||||
WhereClause::Pattern(_) => true,
|
||||
_ => false,
|
||||
}
|
||||
}
|
||||
|
@ -909,8 +898,8 @@ pub enum OrWhereClause {
|
|||
impl OrWhereClause {
|
||||
pub fn is_pattern_or_patterns(&self) -> bool {
|
||||
match self {
|
||||
&OrWhereClause::Clause(WhereClause::Pattern(_)) => true,
|
||||
&OrWhereClause::And(ref clauses) => clauses.iter().all(|clause| clause.is_pattern()),
|
||||
OrWhereClause::Clause(WhereClause::Pattern(_)) => true,
|
||||
OrWhereClause::And(ref clauses) => clauses.iter().all(|clause| clause.is_pattern()),
|
||||
_ => false,
|
||||
}
|
||||
}
|
||||
|
@ -934,8 +923,8 @@ pub struct NotJoin {
|
|||
impl NotJoin {
|
||||
pub fn new(unify_vars: UnifyVars, clauses: Vec<WhereClause>) -> NotJoin {
|
||||
NotJoin {
|
||||
unify_vars: unify_vars,
|
||||
clauses: clauses,
|
||||
unify_vars,
|
||||
clauses,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
@ -1041,8 +1030,8 @@ impl ParsedQuery {
|
|||
Ok(ParsedQuery {
|
||||
find_spec: find_spec.ok_or("expected :find")?,
|
||||
default_source: SrcVar::DefaultSrc,
|
||||
with: with.unwrap_or(vec![]),
|
||||
in_vars: in_vars.unwrap_or(vec![]),
|
||||
with: with.unwrap_or_else(|| vec![]),
|
||||
in_vars: in_vars.unwrap_or_else(|| vec![]),
|
||||
in_sources: BTreeSet::default(),
|
||||
limit: limit.unwrap_or(Limit::None),
|
||||
where_clauses: where_clauses.ok_or("expected :where")?,
|
||||
|
@ -1054,8 +1043,8 @@ impl ParsedQuery {
|
|||
impl OrJoin {
|
||||
pub fn new(unify_vars: UnifyVars, clauses: Vec<OrWhereClause>) -> OrJoin {
|
||||
OrJoin {
|
||||
unify_vars: unify_vars,
|
||||
clauses: clauses,
|
||||
unify_vars,
|
||||
clauses,
|
||||
mentioned_vars: None,
|
||||
}
|
||||
}
|
||||
|
@ -1064,8 +1053,8 @@ impl OrJoin {
|
|||
/// every variable mentioned inside the join is also mentioned in the `UnifyVars` list.
|
||||
pub fn is_fully_unified(&self) -> bool {
|
||||
match &self.unify_vars {
|
||||
&UnifyVars::Implicit => true,
|
||||
&UnifyVars::Explicit(ref vars) => {
|
||||
UnifyVars::Implicit => true,
|
||||
UnifyVars::Explicit(ref vars) => {
|
||||
// We know that the join list must be a subset of the vars in the pattern, or
|
||||
// it would have failed validation. That allows us to simply compare counts here.
|
||||
// TODO: in debug mode, do a full intersection, and verify that our count check
|
||||
|
@ -1094,13 +1083,13 @@ impl ContainsVariables for WhereClause {
|
|||
fn accumulate_mentioned_variables(&self, acc: &mut BTreeSet<Variable>) {
|
||||
use self::WhereClause::*;
|
||||
match self {
|
||||
&OrJoin(ref o) => o.accumulate_mentioned_variables(acc),
|
||||
&Pred(ref p) => p.accumulate_mentioned_variables(acc),
|
||||
&Pattern(ref p) => p.accumulate_mentioned_variables(acc),
|
||||
&NotJoin(ref n) => n.accumulate_mentioned_variables(acc),
|
||||
&WhereFn(ref f) => f.accumulate_mentioned_variables(acc),
|
||||
&TypeAnnotation(ref a) => a.accumulate_mentioned_variables(acc),
|
||||
&RuleExpr => (),
|
||||
OrJoin(ref o) => o.accumulate_mentioned_variables(acc),
|
||||
Pred(ref p) => p.accumulate_mentioned_variables(acc),
|
||||
Pattern(ref p) => p.accumulate_mentioned_variables(acc),
|
||||
NotJoin(ref n) => n.accumulate_mentioned_variables(acc),
|
||||
WhereFn(ref f) => f.accumulate_mentioned_variables(acc),
|
||||
TypeAnnotation(ref a) => a.accumulate_mentioned_variables(acc),
|
||||
RuleExpr => (),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
@ -1109,12 +1098,12 @@ impl ContainsVariables for OrWhereClause {
|
|||
fn accumulate_mentioned_variables(&self, acc: &mut BTreeSet<Variable>) {
|
||||
use self::OrWhereClause::*;
|
||||
match self {
|
||||
&And(ref clauses) => {
|
||||
And(ref clauses) => {
|
||||
for clause in clauses {
|
||||
clause.accumulate_mentioned_variables(acc)
|
||||
}
|
||||
}
|
||||
&Clause(ref clause) => clause.accumulate_mentioned_variables(acc),
|
||||
Clause(ref clause) => clause.accumulate_mentioned_variables(acc),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
@ -1161,7 +1150,7 @@ impl ContainsVariables for NotJoin {
|
|||
impl ContainsVariables for Predicate {
|
||||
fn accumulate_mentioned_variables(&self, acc: &mut BTreeSet<Variable>) {
|
||||
for arg in &self.args {
|
||||
if let &FnArg::Variable(ref v) = arg {
|
||||
if let FnArg::Variable(ref v) = *arg {
|
||||
acc_ref(acc, v)
|
||||
}
|
||||
}
|
||||
|
@ -1177,10 +1166,10 @@ impl ContainsVariables for TypeAnnotation {
|
|||
impl ContainsVariables for Binding {
|
||||
fn accumulate_mentioned_variables(&self, acc: &mut BTreeSet<Variable>) {
|
||||
match self {
|
||||
&Binding::BindScalar(ref v) | &Binding::BindColl(ref v) => acc_ref(acc, v),
|
||||
&Binding::BindRel(ref vs) | &Binding::BindTuple(ref vs) => {
|
||||
Binding::BindScalar(ref v) | Binding::BindColl(ref v) => acc_ref(acc, v),
|
||||
Binding::BindRel(ref vs) | Binding::BindTuple(ref vs) => {
|
||||
for v in vs {
|
||||
if let &VariableOrPlaceholder::Variable(ref v) = v {
|
||||
if let VariableOrPlaceholder::Variable(ref v) = *v {
|
||||
acc_ref(acc, v);
|
||||
}
|
||||
}
|
||||
|
@ -1192,7 +1181,7 @@ impl ContainsVariables for Binding {
|
|||
impl ContainsVariables for WhereFn {
|
||||
fn accumulate_mentioned_variables(&self, acc: &mut BTreeSet<Variable>) {
|
||||
for arg in &self.args {
|
||||
if let &FnArg::Variable(ref v) = arg {
|
||||
if let FnArg::Variable(ref v) = *arg {
|
||||
acc_ref(acc, v)
|
||||
}
|
||||
}
|
||||
|
|
|
@ -130,7 +130,7 @@ impl NamespacedSymbol {
|
|||
}
|
||||
|
||||
#[inline]
|
||||
pub fn components<'a>(&'a self) -> (&'a str, &'a str) {
|
||||
pub fn components(&self) -> (&str, &str) {
|
||||
self.0.components()
|
||||
}
|
||||
}
|
||||
|
@ -180,7 +180,7 @@ impl Keyword {
|
|||
}
|
||||
|
||||
#[inline]
|
||||
pub fn components<'a>(&'a self) -> (&'a str, &'a str) {
|
||||
pub fn components(&self) -> (&str, &str) {
|
||||
self.0.components()
|
||||
}
|
||||
|
||||
|
|
|
@ -328,14 +328,14 @@ macro_rules! def_common_value_methods {
|
|||
|
||||
pub fn is_keyword(&self) -> bool {
|
||||
match self {
|
||||
&$t::Keyword(ref k) => !k.is_namespaced(),
|
||||
$t::Keyword(ref k) => !k.is_namespaced(),
|
||||
_ => false,
|
||||
}
|
||||
}
|
||||
|
||||
pub fn is_namespaced_keyword(&self) -> bool {
|
||||
match self {
|
||||
&$t::Keyword(ref k) => k.is_namespaced(),
|
||||
$t::Keyword(ref k) => k.is_namespaced(),
|
||||
_ => false,
|
||||
}
|
||||
}
|
||||
|
@ -360,21 +360,21 @@ macro_rules! def_common_value_methods {
|
|||
|
||||
pub fn as_keyword(&self) -> Option<&symbols::Keyword> {
|
||||
match self {
|
||||
&$t::Keyword(ref k) => Some(k),
|
||||
$t::Keyword(ref k) => Some(k),
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
|
||||
pub fn as_plain_keyword(&self) -> Option<&symbols::Keyword> {
|
||||
match self {
|
||||
&$t::Keyword(ref k) if !k.is_namespaced() => Some(k),
|
||||
$t::Keyword(ref k) if !k.is_namespaced() => Some(k),
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
|
||||
pub fn as_namespaced_keyword(&self) -> Option<&symbols::Keyword> {
|
||||
match self {
|
||||
&$t::Keyword(ref k) if k.is_namespaced() => Some(k),
|
||||
$t::Keyword(ref k) if k.is_namespaced() => Some(k),
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
|
|
|
@ -22,7 +22,7 @@ where
|
|||
T: Sized + Clone,
|
||||
{
|
||||
fn from_rc(val: Rc<T>) -> Self {
|
||||
val.clone()
|
||||
val
|
||||
}
|
||||
|
||||
fn from_arc(val: Arc<T>) -> Self {
|
||||
|
@ -45,7 +45,7 @@ where
|
|||
}
|
||||
|
||||
fn from_arc(val: Arc<T>) -> Self {
|
||||
val.clone()
|
||||
val
|
||||
}
|
||||
}
|
||||
|
||||
|
|
|
@ -23,7 +23,11 @@ use num::traits::{One, Zero};
|
|||
use ordered_float::OrderedFloat;
|
||||
|
||||
use chrono::{TimeZone, Utc};
|
||||
use edn::{parse, symbols, types::{Span, SpannedValue, Value, ValueAndSpan}, utils, ParseError};
|
||||
use edn::{
|
||||
parse, symbols,
|
||||
types::{Span, SpannedValue, Value, ValueAndSpan},
|
||||
utils, ParseError,
|
||||
};
|
||||
|
||||
// Helper for making wrapped keywords with a namespace.
|
||||
fn k_ns(ns: &str, name: &str) -> Value {
|
||||
|
|
|
@@ -12,7 +12,7 @@ use std; // To refer to std::result::Result.

 use core_traits::{ValueType, ValueTypeSet};

-use edn::{ ParseError, query::PlainSymbol };
+use edn::{query::PlainSymbol, ParseError};

 pub type Result<T> = std::result::Result<T, AlgebrizerError>;
@@ -116,7 +116,7 @@ impl ConjoiningClauses {
         let constrained_types;
         if let Some(required) = self.required_types.get(var) {
-            constrained_types = known_types.intersection(required);
+            constrained_types = known_types.intersection(*required);
         } else {
             constrained_types = known_types;
         }
@@ -90,7 +90,7 @@ impl ConjoiningClauses {
         let mut args = where_fn.args.into_iter();

-        // TODO: process source variables.
+        // TODO(gburd): process source variables.
         match args.next().unwrap() {
             FnArg::SrcVar(SrcVar::DefaultSrc) => {}
             _ => bail!(AlgebrizerError::InvalidArgument(
@@ -104,7 +104,7 @@ impl ConjoiningClauses {
         // TODO: accept placeholder and set of attributes. Alternately, consider putting the search
         // term before the attribute arguments and collect the (variadic) attributes into a set.
         // let a: Entid = self.resolve_attribute_argument(&where_fn.operator, 1, args.next().unwrap())?;
         //
         // TODO: improve the expression of this matching, possibly by using attribute_for_* uniformly.
         let a = match args.next().unwrap() {
@@ -117,7 +117,7 @@ impl ConjoiningClauses {
                 match self.bound_value(&v) {
                     Some(TypedValue::Ref(entid)) => Some(entid),
                     Some(tv) => bail!(AlgebrizerError::InputTypeDisagreement(
-                        v.name().clone(),
+                        v.name(),
                         ValueType::Ref,
                         tv.value_type()
                     )),
@@ -130,20 +130,13 @@ impl ConjoiningClauses {
         // An unknown ident, or an entity that isn't present in the store, or isn't a fulltext
         // attribute, is likely enough to be a coding error that we choose to bail instead of
         // marking the pattern as known-empty.
-        let a = a.ok_or(AlgebrizerError::InvalidArgument(
-            where_fn.operator.clone(),
-            "attribute",
-            1,
-        ))?;
-        let attribute =
-            schema
-                .attribute_for_entid(a)
-                .cloned()
-                .ok_or(AlgebrizerError::InvalidArgument(
-                    where_fn.operator.clone(),
-                    "attribute",
-                    1,
-                ))?;
+        let op = where_fn.operator.clone(); //TODO(gburd): remove me...
+        let a = a.ok_or_else(move || AlgebrizerError::InvalidArgument(op, "attribute", 1))?;
+        let op = where_fn.operator.clone(); //TODO(gburd): remove me...
+        let attribute = schema
+            .attribute_for_entid(a)
+            .cloned()
+            .ok_or_else(move || AlgebrizerError::InvalidArgument(op, "attribute", 1))?;

         if !attribute.fulltext {
             // We can never get results from a non-fulltext attribute!
@@ -271,7 +264,7 @@ impl ConjoiningClauses {
         self.bind_column_to_var(
             schema,
-            fulltext_values_alias.clone(),
+            fulltext_values_alias,
             Column::Fulltext(FulltextColumn::Text),
             var.clone(),
         );
@@ -284,12 +277,7 @@ impl ConjoiningClauses {
                 return Ok(());
             }

-            self.bind_column_to_var(
-                schema,
-                datoms_table_alias.clone(),
-                DatomsColumn::Tx,
-                var.clone(),
-            );
+            self.bind_column_to_var(schema, datoms_table_alias, DatomsColumn::Tx, var.clone());
         }

         if let VariableOrPlaceholder::Variable(ref var) = b_score {
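The fulltext hunk swaps `ok_or(AlgebrizerError::InvalidArgument(...))` for `ok_or_else(move || ...)`. The difference: `ok_or` builds its error argument eagerly, even on the `Some` path (here, cloning `where_fn.operator` on every call); `ok_or_else` defers construction to the `None` path. A sketch with a hypothetical error type standing in for `AlgebrizerError::InvalidArgument`:

```rust
// Hypothetical error type standing in for AlgebrizerError::InvalidArgument.
#[derive(Debug, PartialEq)]
struct InvalidArgument(String, &'static str, usize);

fn operator_name() -> String {
    // Imagine this is a clone of an interned operator symbol, as in the diff.
    "fulltext".to_string()
}

fn require_attribute(found: Option<i64>) -> Result<i64, InvalidArgument> {
    // `ok_or_else` only builds the error (and pays for the String
    // allocation) when `found` is actually `None`.
    found.ok_or_else(|| InvalidArgument(operator_name(), "attribute", 1))
}

fn main() {
    assert_eq!(require_attribute(Some(42)), Ok(42));
    assert_eq!(
        require_attribute(None),
        Err(InvalidArgument("fulltext".to_string(), "attribute", 1))
    );
}
```

This is clippy's `or_fun_call` family of lints; the same reasoning drives the `.ok_or_else(|| AlgebrizerError::UnboundVariable(...))` changes later in this commit.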
@@ -47,7 +47,7 @@ impl ConjoiningClauses {
         let named_values = ComputedTable::NamedValues {
             names: names.clone(),
-            values: values,
+            values,
         };

         let table = self.computed_tables.push_computed(named_values);
@@ -103,13 +103,13 @@ impl ConjoiningClauses {
                 if existing != value {
                     self.mark_known_empty(EmptyBecause::ConflictingBindings {
                         var: var.clone(),
-                        existing: existing.clone(),
+                        existing,
                         desired: value,
                     });
                     return Ok(());
                 }
             } else {
-                self.bind_value(&var, value.clone());
+                self.bind_value(&var, value);
             }

             Ok(())
@@ -180,7 +180,7 @@ impl ConjoiningClauses {
             .into_iter()
             .filter_map(|arg| -> Option<Result<TypedValue>> {
                 // We need to get conversion errors out.
-                // We also want to mark known-empty on impossibilty, but
+                // We also want to mark known-empty on impossibility, but
                 // still detect serious errors.
                 match self.typed_value_from_arg(schema, &var, arg, known_types) {
                     Ok(ValueConversion::Val(tv)) => {
@@ -188,7 +188,7 @@ impl ConjoiningClauses {
                             && !accumulated_types.is_unit()
                         {
                             // Values not all of the same type.
-                            Some(Err(AlgebrizerError::InvalidGroundConstant.into()))
+                            Some(Err(AlgebrizerError::InvalidGroundConstant))
                         } else {
                             Some(Ok(tv))
                         }
@@ -198,7 +198,7 @@ impl ConjoiningClauses {
                         skip = Some(because);
                         None
                     }
-                    Err(e) => Some(Err(e.into())),
+                    Err(e) => Some(Err(e)),
                 }
             })
             .collect::<Result<Vec<TypedValue>>>()?;
@@ -211,7 +211,7 @@ impl ConjoiningClauses {
         // Otherwise, we now have the values and the type.
         let types = vec![accumulated_types.exemplar().unwrap()];
-        let names = vec![var.clone()];
+        let names = vec![var];

         self.collect_named_bindings(schema, names, types, values);
         Ok(())
@@ -227,8 +227,8 @@ impl ConjoiningClauses {
         let template: Vec<Option<(Variable, ValueTypeSet)>> = places
             .iter()
             .map(|x| match x {
-                &VariableOrPlaceholder::Placeholder => None,
-                &VariableOrPlaceholder::Variable(ref v) => {
+                VariableOrPlaceholder::Placeholder => None,
+                VariableOrPlaceholder::Variable(ref v) => {
                     Some((v.clone(), self.known_type_set(v)))
                 }
             })
@@ -271,7 +271,7 @@ impl ConjoiningClauses {
             // Convert each item in the row.
             // If any value in the row is impossible, then skip the row.
             // If all rows are impossible, fail the entire CC.
-            if let &Some(ref pair) = pair {
+            if let Some(ref pair) = pair {
                 match self.typed_value_from_arg(schema, &pair.0, col, pair.1)? {
                     ValueConversion::Val(tv) => vals.push(tv),
                     ValueConversion::Impossible(because) => {
@@ -55,7 +55,7 @@ impl QueryInputs {
                 .iter()
                 .map(|(var, val)| (var.clone(), val.value_type()))
                 .collect(),
-            values: values,
+            values,
         }
     }

@@ -73,9 +73,6 @@ impl QueryInputs {
                 }
             }
         }
-        Ok(QueryInputs {
-            types: types,
-            values: values,
-        })
+        Ok(QueryInputs { types, values })
     }
 }
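The `values: values` → `values` and `Ok(QueryInputs { types, values })` changes use struct field init shorthand: when a local variable has the same name as the field it initializes, the `name: name` repetition can be dropped (clippy's `redundant_field_names`). A minimal sketch with a hypothetical simplified `QueryInputs`-like struct:

```rust
use std::collections::BTreeMap;

// Hypothetical simplified stand-in for QueryInputs.
struct QueryInputs {
    types: BTreeMap<String, &'static str>,
    values: BTreeMap<String, i64>,
}

fn build(values: BTreeMap<String, i64>) -> QueryInputs {
    let types = values
        .iter()
        .map(|(var, _)| (var.clone(), "Long"))
        .collect();
    // Shorthand: equivalent to `QueryInputs { types: types, values: values }`.
    QueryInputs { types, values }
}

fn main() {
    let mut vals = BTreeMap::new();
    vals.insert("?x".to_string(), 10);
    let inputs = build(vals);
    assert_eq!(inputs.types.get("?x"), Some(&"Long"));
    assert_eq!(inputs.values.get("?x"), Some(&10));
}
```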
@@ -147,8 +147,8 @@ pub struct ConjoiningClauses {
     /// A map from var to qualified columns. Used to project.
     pub column_bindings: BTreeMap<Variable, Vec<QualifiedAlias>>,

-    /// A list of variables mentioned in the enclosing query's :in clause. These must all be bound
-    /// before the query can be executed. TODO: clarify what this means for nested CCs.
+    /// A list of variables mentioned in the enclosing query's `:in` clause all of which must be
+    /// bound before the query can be executed. TODO: clarify what this means for nested CCs.
     pub input_variables: BTreeSet<Variable>,

     /// In some situations -- e.g., when a query is being run only once -- we know in advance the
@@ -279,7 +279,7 @@ impl ConjoiningClauses {
         values.keep_intersected_keys(&in_variables);

         let mut cc = ConjoiningClauses {
-            alias_counter: alias_counter,
+            alias_counter,
             input_variables: in_variables,
             value_bindings: values,
             ..Default::default()
@@ -301,14 +301,8 @@ impl ConjoiningClauses {
 impl ConjoiningClauses {
     pub(crate) fn derive_types_from_find_spec(&mut self, find_spec: &FindSpec) {
         for spec in find_spec.columns() {
-            match spec {
-                &Element::Pull(Pull {
-                    ref var,
-                    patterns: _,
-                }) => {
-                    self.constrain_var_to_type(var.clone(), ValueType::Ref);
-                }
-                _ => {}
+            if let Element::Pull(Pull { ref var, .. }) = spec {
+                self.constrain_var_to_type(var.clone(), ValueType::Ref);
             }
         }
     }
@@ -410,7 +404,7 @@ impl ConjoiningClauses {
         self.known_types
             .get(var)
             .cloned()
-            .unwrap_or(ValueTypeSet::any())
+            .unwrap_or_else(ValueTypeSet::any)
     }

     pub(crate) fn bind_column_to_var<C: Into<Column>>(
@@ -514,7 +508,7 @@ impl ConjoiningClauses {
         self.column_bindings
             .entry(var)
-            .or_insert(vec![])
+            .or_insert_with(|| vec![])
             .push(alias);
     }

@@ -585,10 +579,10 @@ impl ConjoiningClauses {
         these_types: ValueTypeSet,
     ) -> Option<EmptyBecause> {
         if let Some(existing) = self.known_types.get(var) {
-            if existing.intersection(&these_types).is_empty() {
+            if existing.intersection(these_types).is_empty() {
                 return Some(EmptyBecause::TypeMismatch {
                     var: var.clone(),
-                    existing: existing.clone(),
+                    existing: *existing,
                     desired: these_types,
                 });
             }
@@ -640,7 +634,7 @@ impl ConjoiningClauses {
                 // We have an existing requirement. The new requirement will be
                 // the intersection, but we'll `mark_known_empty` if that's empty.
                 let existing = *entry.get();
-                let intersection = types.intersection(&existing);
+                let intersection = types.intersection(existing);
                 entry.insert(intersection);

                 if !intersection.is_empty() {
@@ -648,8 +642,8 @@ impl ConjoiningClauses {
                 }

                 EmptyBecause::TypeMismatch {
-                    var: var,
-                    existing: existing,
+                    var,
+                    existing,
                     desired: types,
                 }
             }
@@ -684,7 +678,7 @@ impl ConjoiningClauses {
                     panic!("Uh oh: we failed this pattern, probably because {:?} couldn't match, but now we're broadening its type.",
                            e.key());
                 }
-                new = existing_types.union(&new_types);
+                new = existing_types.union(new_types);
             }
             e.insert(new);
         }
@@ -710,11 +704,11 @@ impl ConjoiningClauses {
                     e.insert(types);
                 }
                 Entry::Occupied(mut e) => {
-                    let intersected: ValueTypeSet = types.intersection(e.get());
+                    let intersected: ValueTypeSet = types.intersection(*e.get());
                     if intersected.is_empty() {
                         let reason = EmptyBecause::TypeMismatch {
                             var: e.key().clone(),
-                            existing: e.get().clone(),
+                            existing: *e.get(),
                             desired: types,
                         };
                         empty_because = Some(reason);
@@ -751,7 +745,7 @@ impl ConjoiningClauses {
         // If it's a variable, record that it has the right type.
         // Ident or attribute resolution errors (the only other check we need to do) will be done
         // by the caller.
-        if let &EvolvedNonValuePlace::Variable(ref v) = value {
+        if let EvolvedNonValuePlace::Variable(ref v) = value {
             self.constrain_var_to_type(v.clone(), ValueType::Ref)
         }
     }
@@ -784,12 +778,12 @@ impl ConjoiningClauses {
     ) -> ::std::result::Result<DatomsTable, EmptyBecause> {
         if attribute.fulltext {
             match value {
-                &EvolvedValuePlace::Placeholder => Ok(DatomsTable::Datoms), // We don't need the value.
+                EvolvedValuePlace::Placeholder => Ok(DatomsTable::Datoms), // We don't need the value.

                 // TODO: an existing non-string binding can cause this pattern to fail.
-                &EvolvedValuePlace::Variable(_) => Ok(DatomsTable::FulltextDatoms),
+                EvolvedValuePlace::Variable(_) => Ok(DatomsTable::FulltextDatoms),

-                &EvolvedValuePlace::Value(TypedValue::String(_)) => Ok(DatomsTable::FulltextDatoms),
+                EvolvedValuePlace::Value(TypedValue::String(_)) => Ok(DatomsTable::FulltextDatoms),

                 _ => {
                     // We can't succeed if there's a non-string constant value for a fulltext
@@ -802,9 +796,9 @@ impl ConjoiningClauses {
         }
     }

-    fn table_for_unknown_attribute<'s, 'a>(
+    fn table_for_unknown_attribute(
         &self,
-        value: &'a EvolvedValuePlace,
+        value: &EvolvedValuePlace,
     ) -> ::std::result::Result<DatomsTable, EmptyBecause> {
         // If the value is known to be non-textual, we can simply use the regular datoms
         // table (TODO: and exclude on `index_fulltext`!).
@@ -817,7 +811,7 @@ impl ConjoiningClauses {
         Ok(match value {
             // TODO: see if the variable is projected, aggregated, or compared elsewhere in
             // the query. If it's not, we don't need to use all_datoms here.
-            &EvolvedValuePlace::Variable(ref v) => {
+            EvolvedValuePlace::Variable(ref v) => {
                 // If `required_types` and `known_types` don't exclude strings,
                 // we need to query `all_datoms`.
                 if self
@@ -834,7 +828,7 @@ impl ConjoiningClauses {
                     DatomsTable::Datoms
                 }
             }
-            &EvolvedValuePlace::Value(TypedValue::String(_)) => DatomsTable::AllDatoms,
+            EvolvedValuePlace::Value(TypedValue::String(_)) => DatomsTable::AllDatoms,
             _ => DatomsTable::Datoms,
         })
     }
@@ -850,14 +844,14 @@ impl ConjoiningClauses {
         value: &'a EvolvedValuePlace,
     ) -> ::std::result::Result<DatomsTable, EmptyBecause> {
         match attribute {
-            &EvolvedNonValuePlace::Entid(id) => schema
-                .attribute_for_entid(id)
-                .ok_or_else(|| EmptyBecause::InvalidAttributeEntid(id))
+            EvolvedNonValuePlace::Entid(id) => schema
+                .attribute_for_entid(*id)
+                .ok_or_else(|| EmptyBecause::InvalidAttributeEntid(*id))
                 .and_then(|attribute| self.table_for_attribute_and_value(attribute, value)),
             // TODO: In a prepared context, defer this decision until a second algebrizing phase.
             // #278.
-            &EvolvedNonValuePlace::Placeholder => self.table_for_unknown_attribute(value),
-            &EvolvedNonValuePlace::Variable(ref v) => {
+            EvolvedNonValuePlace::Placeholder => self.table_for_unknown_attribute(value),
+            EvolvedNonValuePlace::Variable(ref v) => {
                 // See if we have a binding for the variable.
                 match self.bound_value(v) {
                     // TODO: In a prepared context, defer this decision until a second algebrizing phase.
@@ -883,7 +877,7 @@ impl ConjoiningClauses {
                         // attribute place.
                         Err(EmptyBecause::InvalidBinding(
                             Column::Fixed(DatomsColumn::Attribute),
-                            v.clone(),
+                            v,
                         ))
                     }
                 }
@@ -922,8 +916,8 @@ impl ConjoiningClauses {
     ) -> Option<&'s Attribute> {
         match value {
             // We know this one is known if the attribute lookup succeeds…
-            &TypedValue::Ref(id) => schema.attribute_for_entid(id),
-            &TypedValue::Keyword(ref kw) => schema.attribute_for_ident(kw).map(|(a, _id)| a),
+            TypedValue::Ref(id) => schema.attribute_for_entid(*id),
+            TypedValue::Keyword(ref kw) => schema.attribute_for_ident(kw).map(|(a, _id)| a),
             _ => None,
         }
     }
@@ -981,7 +975,7 @@ impl ConjoiningClauses {
     pub(crate) fn expand_column_bindings(&mut self) {
         for cols in self.column_bindings.values() {
             if cols.len() > 1 {
-                let ref primary = cols[0];
+                let primary = &cols[0];
                 let secondaries = cols.iter().skip(1);
                 for secondary in secondaries {
                     // TODO: if both primary and secondary are .v, should we make sure
@@ -1029,18 +1023,18 @@ impl ConjoiningClauses {
         let mut empty_because: Option<EmptyBecause> = None;
         for (var, types) in self.required_types.clone().into_iter() {
             if let Some(already_known) = self.known_types.get(&var) {
-                if already_known.is_disjoint(&types) {
+                if already_known.is_disjoint(types) {
                     // If we know the constraint can't be one of the types
                     // the variable could take, then we know we're empty.
                     empty_because = Some(EmptyBecause::TypeMismatch {
-                        var: var,
+                        var,
                         existing: *already_known,
                         desired: types,
                     });
                     break;
                 }

-                if already_known.is_subset(&types) {
+                if already_known.is_subset(types) {
                     // TODO: I'm not convinced that we can do nothing here.
                     //
                     // Consider `[:find ?x ?v :where [_ _ ?v] [(> ?v 10)] [?x :foo/long ?v]]`.
@@ -1129,7 +1123,7 @@ impl ConjoiningClauses {
     }

     fn mark_as_ref(&mut self, pos: &PatternNonValuePlace) {
-        if let &PatternNonValuePlace::Variable(ref var) = pos {
+        if let PatternNonValuePlace::Variable(ref var) = pos {
             self.constrain_var_to_type(var.clone(), ValueType::Ref)
         }
     }
@@ -1142,13 +1136,13 @@ impl ConjoiningClauses {
         // We apply (top level) type predicates first as an optimization.
         for clause in where_clauses.iter() {
             match clause {
-                &WhereClause::TypeAnnotation(ref anno) => {
+                WhereClause::TypeAnnotation(ref anno) => {
                     self.apply_type_anno(anno)?;
                 }

                 // Patterns are common, so let's grab as much type information from
                 // them as we can.
-                &WhereClause::Pattern(ref p) => {
+                WhereClause::Pattern(ref p) => {
                     self.mark_as_ref(&p.entity);
                     self.mark_as_ref(&p.attribute);
                     self.mark_as_ref(&p.tx);
@@ -1167,7 +1161,7 @@ impl ConjoiningClauses {
         let mut patterns: VecDeque<EvolvedPattern> = VecDeque::with_capacity(remaining);
         for clause in where_clauses {
             remaining -= 1;
-            if let &WhereClause::TypeAnnotation(_) = &clause {
+            if let WhereClause::TypeAnnotation(_) = &clause {
                 continue;
             }
             match clause {
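Two related changes in the section above replace eager defaults with lazy ones: `.or_insert(vec![])` becomes `.or_insert_with(|| vec![])` in `bind_column_to_var`, and `.unwrap_or(ValueTypeSet::any())` becomes `.unwrap_or_else(ValueTypeSet::any)`. In both cases the eager form constructs the default even when the map entry or `Option` already has a value. A sketch with a hypothetical helper mirroring the entry-API use:

```rust
use std::collections::BTreeMap;

// Hypothetical helper mirroring bind_column_to_var's entry-API use.
fn bind(bindings: &mut BTreeMap<String, Vec<&'static str>>, var: &str, col: &'static str) {
    // `or_insert_with` only allocates the empty Vec when the key is vacant;
    // `or_insert(vec![])` would construct a Vec on every call, even when
    // the entry already exists and the default is discarded.
    bindings
        .entry(var.to_string())
        .or_insert_with(Vec::new)
        .push(col);
}

fn main() {
    let mut bindings = BTreeMap::new();
    bind(&mut bindings, "?x", "datoms00.e");
    bind(&mut bindings, "?x", "datoms01.e");
    assert_eq!(bindings["?x"], vec!["datoms00.e", "datoms01.e"]);

    // Same idea for Option: `unwrap_or_else` defers the default computation
    // to the None path.
    let known: Option<i64> = None;
    assert_eq!(known.unwrap_or_else(|| 2 + 2), 4);
}
```

For a `Vec` default the cost is trivial, but clippy applies the lint uniformly, and the `_with`/`_else` forms never cost more.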
@@ -642,7 +642,7 @@ impl ConjoiningClauses {
         // For any variable which has an imprecise type anywhere in the UNION, add it to the
         // set that needs type extraction. All UNION arms must project the same columns.
         for var in projection.iter() {
-            if acc.iter().any(|cc| !cc.known_type(var).is_some()) {
+            if acc.iter().any(|cc| cc.known_type(var).is_none()) {
                 type_needed.insert(var.clone());
             }
         }
@@ -672,7 +672,7 @@ impl ConjoiningClauses {
         }

         let union = ComputedTable::Union {
-            projection: projection,
+            projection,
             type_extraction: type_needed,
             arms: acc,
         };
@@ -730,7 +730,7 @@ fn union_types(
             e.insert(new_types.clone());
         }
         Entry::Occupied(mut e) => {
-            let new = e.get().union(&new_types);
+            let new = e.get().union(*new_types);
             e.insert(new);
         }
     }
@@ -8,6 +8,8 @@
 // CONDITIONS OF ANY KIND, either express or implied. See the License for the
 // specific language governing permissions and limitations under the License.

+#![allow(clippy::single_match)]
+
 use core_traits::{Entid, TypedValue, ValueType, ValueTypeSet};

 use mentat_core::{Cloned, HasSchema};
@@ -27,7 +29,7 @@ use Known;
 pub fn into_typed_value(nic: NonIntegerConstant) -> TypedValue {
     match nic {
-        NonIntegerConstant::BigInteger(_) => unimplemented!(), // TODO: #280.
+        NonIntegerConstant::BigInteger(_) => unimplemented!(), // TODO(gburd): #280.
         NonIntegerConstant::Boolean(v) => TypedValue::Boolean(v),
         NonIntegerConstant::Float(v) => TypedValue::Double(v),
         NonIntegerConstant::Text(v) => v.into(),
@@ -93,17 +95,15 @@ impl ConjoiningClauses {
         self.constrain_to_ref(&pattern.entity);
         self.constrain_to_ref(&pattern.attribute);

-        let ref col = alias.1;
+        let col = &alias.1;

         let schema = known.schema;
         match pattern.entity {
             EvolvedNonValuePlace::Placeholder =>
             // Placeholders don't contribute any column bindings, nor do
             // they constrain the query -- there's no need to produce
             // IS NOT NULL, because we don't store nulls in our schema.
-            {
-                ()
-            }
+            {}
             EvolvedNonValuePlace::Variable(ref v) => {
                 self.bind_column_to_var(schema, col.clone(), DatomsColumn::Entity, v.clone())
             }
@@ -287,7 +287,7 @@ impl ConjoiningClauses {
                     None => {
                         self.mark_known_empty(EmptyBecause::CachedAttributeHasNoEntity {
                             value: val.clone(),
-                            attr: attr,
+                            attr,
                         });
                         true
                     }
@@ -301,7 +301,7 @@ impl ConjoiningClauses {
                     None => {
                         self.mark_known_empty(EmptyBecause::CachedAttributeHasNoEntity {
                             value: val.clone(),
-                            attr: attr,
+                            attr,
                         });
                         true
                     }
@@ -403,8 +403,8 @@ impl ConjoiningClauses {
                                     None => {
                                         self.mark_known_empty(
                                             EmptyBecause::CachedAttributeHasNoValues {
-                                                entity: entity,
-                                                attr: attr,
+                                                entity,
+                                                attr,
                                             },
                                         );
                                         return true;
@@ -416,7 +416,7 @@ impl ConjoiningClauses {
                             }
                         }
                     }
-                    _ => {} // TODO: check constant values against cache.
+                    _ => {} // TODO: check constant values against the cache.
                 }
             }
             _ => {}
@@ -591,7 +591,7 @@ impl ConjoiningClauses {
                     entity: e,
                     attribute: a,
                     value: v,
-                    tx: tx,
+                    tx,
                 }),
             },
         },
@@ -612,7 +612,7 @@ impl ConjoiningClauses {
         let mut new_value: Option<EvolvedValuePlace> = None;

         match &pattern.entity {
-            &EvolvedNonValuePlace::Variable(ref var) => {
+            EvolvedNonValuePlace::Variable(ref var) => {
                 // See if we have it yet!
                 match self.bound_value(&var) {
                     None => (),
@@ -631,12 +631,12 @@ impl ConjoiningClauses {
             _ => (),
         }
         match &pattern.value {
-            &EvolvedValuePlace::Variable(ref var) => {
+            EvolvedValuePlace::Variable(ref var) => {
                 // See if we have it yet!
                 match self.bound_value(&var) {
                     None => (),
                     Some(tv) => {
-                        new_value = Some(EvolvedValuePlace::Value(tv.clone()));
+                        new_value = Some(EvolvedValuePlace::Value(tv));
                     }
                 };
             }
@@ -679,7 +679,6 @@ impl ConjoiningClauses {
                 // between an attribute and a value.
                 // We know we cannot return a result, so we short-circuit here.
                 self.mark_known_empty(EmptyBecause::AttributeLookupFailed);
-                return;
             }
         }
     }
@@ -44,7 +44,7 @@ impl ConjoiningClauses {

     fn potential_types(&self, schema: &Schema, fn_arg: &FnArg) -> Result<ValueTypeSet> {
         match fn_arg {
-            &FnArg::Variable(ref v) => Ok(self.known_type_set(v)),
+            FnArg::Variable(ref v) => Ok(self.known_type_set(v)),
             _ => fn_arg.potential_types(schema),
         }
     }
@@ -95,7 +95,7 @@ impl ConjoiningClauses {
         let supported_types = comparison.supported_types();
         let mut left_types = self
             .potential_types(known.schema, &left)?
-            .intersection(&supported_types);
+            .intersection(supported_types);
         if left_types.is_empty() {
             bail!(AlgebrizerError::InvalidArgumentType(
                 predicate.operator.clone(),
@@ -106,7 +106,7 @@ impl ConjoiningClauses {

         let mut right_types = self
             .potential_types(known.schema, &right)?
-            .intersection(&supported_types);
+            .intersection(supported_types);
         if right_types.is_empty() {
             bail!(AlgebrizerError::InvalidArgumentType(
                 predicate.operator.clone(),
@@ -125,7 +125,7 @@ impl ConjoiningClauses {
             left_types.insert(ValueType::Double);
         }

-        let shared_types = left_types.intersection(&right_types);
+        let shared_types = left_types.intersection(right_types);
         if shared_types.is_empty() {
             // In isolation these are both valid inputs to the operator, but the query cannot
             // succeed because the types don't match.
@@ -176,8 +176,8 @@ impl ConjoiningClauses {
 }

 impl Inequality {
-    fn to_constraint(&self, left: QueryValue, right: QueryValue) -> ColumnConstraint {
-        match *self {
+    fn to_constraint(self, left: QueryValue, right: QueryValue) -> ColumnConstraint {
+        match self {
             Inequality::TxAfter | Inequality::TxBefore => {
                 // TODO: both ends of the range must be inside the tx partition!
                 // If we know the partition map -- and at this point we do, it's just
@@ -188,9 +188,9 @@ impl Inequality {
         }

         ColumnConstraint::Inequality {
-            operator: *self,
-            left: left,
-            right: right,
+            operator: self,
+            left,
+            right,
         }
     }
 }
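`to_constraint(&self)` becoming `to_constraint(self)` (and `supported_types` likewise, later in this commit) reflects clippy's `trivially_copy_pass_by_ref`: for a small `Copy` enum like `Inequality`, passing by value is at least as cheap as passing a reference, and it lets `match self` drop the `&` patterns and `*self` dereferences. A sketch with a hypothetical two-variant stand-in:

```rust
// Hypothetical Copy enum standing in for Inequality.
#[derive(Clone, Copy, Debug, PartialEq)]
enum Inequality {
    LessThan,
    GreaterThan,
}

impl Inequality {
    // Taking `self` by value: an enum this small is no more expensive to
    // copy than a pointer to it, and the match needs no `&` patterns.
    fn symbol(self) -> &'static str {
        match self {
            Inequality::LessThan => "<",
            Inequality::GreaterThan => ">",
        }
    }
}

fn main() {
    let op = Inequality::LessThan;
    assert_eq!(op.symbol(), "<");
    // `op` is Copy, so it is still usable after the by-value call.
    assert_eq!(op, Inequality::LessThan);
}
```

The same reasoning explains the `intersection(&x)` → `intersection(x)` changes: `ValueTypeSet` is a small `Copy` bitset, so its set operations take operands by value.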
@@ -41,14 +41,14 @@ impl ConjoiningClauses {
                 if v.value_type().is_numeric() {
                     Ok(QueryValue::TypedValue(v))
                 } else {
-                    bail!(AlgebrizerError::InputTypeDisagreement(var.name().clone(), ValueType::Long, v.value_type()))
+                    bail!(AlgebrizerError::InputTypeDisagreement(var.name(), ValueType::Long, v.value_type()))
                 }
             } else {
                 self.constrain_var_to_numeric(var.clone());
                 self.column_bindings
                     .get(&var)
                     .and_then(|cols| cols.first().map(|col| QueryValue::Column(col.clone())))
-                    .ok_or_else(|| AlgebrizerError::UnboundVariable(var.name()).into())
+                    .ok_or_else(|| AlgebrizerError::UnboundVariable(var.name()))
             }
         },
         // Can't be an entid.
@@ -80,7 +80,7 @@ impl ConjoiningClauses {
             FnArg::Variable(var) => match self.bound_value(&var) {
                 Some(TypedValue::Instant(v)) => Ok(QueryValue::TypedValue(TypedValue::Instant(v))),
                 Some(v) => bail!(AlgebrizerError::InputTypeDisagreement(
-                    var.name().clone(),
+                    var.name(),
                     ValueType::Instant,
                     v.value_type()
                 )),
@@ -89,7 +89,7 @@ impl ConjoiningClauses {
                 self.column_bindings
                     .get(&var)
                     .and_then(|cols| cols.first().map(|col| QueryValue::Column(col.clone())))
-                    .ok_or_else(|| AlgebrizerError::UnboundVariable(var.name()).into())
+                    .ok_or_else(|| AlgebrizerError::UnboundVariable(var.name()))
             }
         },
         Constant(NonIntegerConstant::Instant(v)) => {
@@ -136,14 +136,14 @@ impl ConjoiningClauses {
                 self.column_bindings
                     .get(&var)
                     .and_then(|cols| cols.first().map(|col| QueryValue::Column(col.clone())))
-                    .ok_or_else(|| AlgebrizerError::UnboundVariable(var.name()).into())
+                    .ok_or_else(|| AlgebrizerError::UnboundVariable(var.name()))
             }
         }
         EntidOrInteger(i) => Ok(QueryValue::TypedValue(TypedValue::Ref(i))),
         IdentOrKeyword(i) => schema
             .get_entid(&i)
             .map(|known_entid| QueryValue::Entid(known_entid.into()))
-            .ok_or_else(|| AlgebrizerError::UnrecognizedIdent(i.to_string()).into()),
+            .ok_or_else(|| AlgebrizerError::UnrecognizedIdent(i.to_string())),
         Constant(NonIntegerConstant::Boolean(_))
         | Constant(NonIntegerConstant::Float(_))
         | Constant(NonIntegerConstant::Text(_))
@@ -188,7 +188,7 @@ impl ConjoiningClauses {
             .column_bindings
             .get(&var)
             .and_then(|cols| cols.first().map(|col| QueryValue::Column(col.clone())))
-            .ok_or_else(|| AlgebrizerError::UnboundVariable(var.name()).into()),
+            .ok_or_else(|| AlgebrizerError::UnboundVariable(var.name())),
         },
         EntidOrInteger(i) => Ok(QueryValue::PrimitiveLong(i)),
         IdentOrKeyword(_) => unimplemented!(), // TODO
@@ -122,7 +122,7 @@ impl ConjoiningClauses {
             known.schema,
             transactions.clone(),
             TransactionsColumn::Tx,
-            tx_var.clone(),
+            tx_var,
         );

         let after_constraint = ColumnConstraint::Inequality {
@@ -138,7 +138,7 @@ impl ConjoiningClauses {
         let before_constraint = ColumnConstraint::Inequality {
             operator: Inequality::LessThan,
             left: QueryValue::Column(QualifiedAlias(
-                transactions.clone(),
+                transactions,
                 Column::Transactions(TransactionsColumn::Tx),
             )),
             right: tx2,
@@ -306,7 +306,7 @@ impl ConjoiningClauses {
         self.bind_column_to_var(
             known.schema,
-            transactions.clone(),
+            transactions,
             TransactionsColumn::Added,
             var.clone(),
         );
@@ -312,7 +312,7 @@ pub fn algebrize_with_inputs(
     cc.derive_types_from_find_spec(&parsed.find_spec);

     // Do we have a variable limit? If so, tell the CC that the var must be numeric.
-    if let &Limit::Variable(ref var) = &parsed.limit {
+    if let Limit::Variable(ref var) = parsed.limit {
         cc.constrain_var_to_long(var.clone());
     }

@@ -338,9 +338,9 @@ pub fn algebrize_with_inputs(
         has_aggregates: false, // TODO: we don't parse them yet.
         with: parsed.with,
         named_projection: extra_vars,
-        order: order,
-        limit: limit,
-        cc: cc,
+        order,
+        limit,
+        cc,
     };

     // Substitute in any fixed values and fail if they're out of range.
@@ -364,7 +364,7 @@ impl FindQuery {
             in_vars: BTreeSet::default(),
             in_sources: BTreeSet::default(),
             limit: Limit::None,
-            where_clauses: where_clauses,
+            where_clauses,
             order: None,
         }
     }
@@ -417,5 +417,5 @@ impl FindQuery {
 pub fn parse_find_string(string: &str) -> Result<FindQuery> {
     parse_query(string)
         .map_err(|e| e.into())
-        .and_then(|parsed| FindQuery::from_parsed_query(parsed))
+        .and_then(FindQuery::from_parsed_query)
 }
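The `parse_find_string` change replaces `.and_then(|parsed| FindQuery::from_parsed_query(parsed))` with the bare function path: a closure that only forwards its argument can be replaced by the function itself (clippy's `redundant_closure`), since `Option`/`Result` combinators accept any `FnOnce` with a matching signature. A sketch with a hypothetical parse pipeline mirroring the same shape:

```rust
// Hypothetical parse pipeline mirroring parse_find_string's shape.
fn parse_query(s: &str) -> Result<u32, String> {
    s.trim().parse::<u32>().map_err(|e| e.to_string())
}

fn validate(n: u32) -> Result<u32, String> {
    if n > 0 {
        Ok(n)
    } else {
        Err("zero is not a valid count".to_string())
    }
}

fn parse_positive(s: &str) -> Result<u32, String> {
    // `.and_then(|n| validate(n))` and the bare path `validate` are
    // equivalent here; clippy prefers the latter.
    parse_query(s).and_then(validate)
}

fn main() {
    assert_eq!(parse_positive(" 7 "), Ok(7));
    assert!(parse_positive("0").is_err());
    assert!(parse_positive("abc").is_err());
}
```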
@@ -153,8 +153,8 @@ impl ColumnName for DatomsColumn {
 impl ColumnName for VariableColumn {
     fn column_name(&self) -> String {
         match self {
-            &VariableColumn::Variable(ref v) => v.to_string(),
-            &VariableColumn::VariableTypeTag(ref v) => format!("{}_value_type_tag", v.as_str()),
+            VariableColumn::Variable(ref v) => v.to_string(),
+            VariableColumn::VariableTypeTag(ref v) => format!("{}_value_type_tag", v.as_str()),
         }
     }
 }
@@ -163,8 +163,8 @@ impl Debug for VariableColumn {
     fn fmt(&self, f: &mut Formatter) -> ::std::fmt::Result {
         match self {
             // These should agree with VariableColumn::column_name.
-            &VariableColumn::Variable(ref v) => write!(f, "{}", v.as_str()),
-            &VariableColumn::VariableTypeTag(ref v) => write!(f, "{}_value_type_tag", v.as_str()),
+            VariableColumn::Variable(ref v) => write!(f, "{}", v.as_str()),
+            VariableColumn::VariableTypeTag(ref v) => write!(f, "{}_value_type_tag", v.as_str()),
         }
     }
 }
@@ -178,10 +178,10 @@ impl Debug for DatomsColumn {
 impl Debug for Column {
     fn fmt(&self, f: &mut Formatter) -> ::std::fmt::Result {
         match self {
-            &Column::Fixed(ref c) => c.fmt(f),
-            &Column::Fulltext(ref c) => c.fmt(f),
-            &Column::Variable(ref v) => v.fmt(f),
-            &Column::Transactions(ref t) => t.fmt(f),
+            Column::Fixed(ref c) => c.fmt(f),
+            Column::Fulltext(ref c) => c.fmt(f),
+            Column::Variable(ref v) => v.fmt(f),
+            Column::Transactions(ref t) => t.fmt(f),
         }
     }
 }
@@ -298,10 +298,10 @@ impl Debug for QueryValue {
     fn fmt(&self, f: &mut Formatter) -> ::std::fmt::Result {
         use self::QueryValue::*;
         match self {
-            &Column(ref qa) => write!(f, "{:?}", qa),
-            &Entid(ref entid) => write!(f, "entity({:?})", entid),
-            &TypedValue(ref typed_value) => write!(f, "value({:?})", typed_value),
-            &PrimitiveLong(value) => write!(f, "primitive({:?})", value),
+            Column(ref qa) => write!(f, "{:?}", qa),
+            Entid(ref entid) => write!(f, "entity({:?})", entid),
+            TypedValue(ref typed_value) => write!(f, "value({:?})", typed_value),
+            PrimitiveLong(value) => write!(f, "primitive({:?})", value),
         }
     }
 }
@@ -375,15 +375,15 @@ impl Inequality {
     }

     // The built-in inequality operators apply to Long, Double, and Instant.
-    pub fn supported_types(&self) -> ValueTypeSet {
+    pub fn supported_types(self) -> ValueTypeSet {
         use self::Inequality::*;
         match self {
-            &LessThan | &LessThanOrEquals | &GreaterThan | &GreaterThanOrEquals | &NotEquals => {
+            LessThan | LessThanOrEquals | GreaterThan | GreaterThanOrEquals | NotEquals => {
                 let mut ts = ValueTypeSet::of_numeric_types();
                 ts.insert(ValueType::Instant);
                 ts
             }
-            &Unpermute | &Differ | &TxAfter | &TxBefore => ValueTypeSet::of_one(ValueType::Ref),
+            Unpermute | Differ | TxAfter | TxBefore => ValueTypeSet::of_one(ValueType::Ref),
         }
     }
 }
@@ -392,17 +392,17 @@ impl Debug for Inequality {
     fn fmt(&self, f: &mut Formatter) -> ::std::fmt::Result {
         use self::Inequality::*;
         f.write_str(match self {
-            &LessThan => "<",
-            &LessThanOrEquals => "<=",
-            &GreaterThan => ">",
-            &GreaterThanOrEquals => ">=",
-            &NotEquals => "!=", // Datalog uses !=. SQL uses <>.
+            LessThan => "<",
+            LessThanOrEquals => "<=",
+            GreaterThan => ">",
+            GreaterThanOrEquals => ">=",
+            NotEquals => "!=", // Datalog uses !=. SQL uses <>.

-            &Unpermute => "<",
-            &Differ => "<>",
+            Unpermute => "<",
+            Differ => "<>",

-            &TxAfter => ">",
-            &TxBefore => "<",
+            TxAfter => ">",
+            TxBefore => "<",
         })
     }
 }
@@ -534,17 +534,17 @@ impl Debug for ColumnConstraint {
     fn fmt(&self, f: &mut Formatter) -> ::std::fmt::Result {
         use self::ColumnConstraint::*;
|
||||
match self {
|
||||
&Equals(ref qa1, ref thing) => write!(f, "{:?} = {:?}", qa1, thing),
|
||||
Equals(ref qa1, ref thing) => write!(f, "{:?} = {:?}", qa1, thing),
|
||||
|
||||
&Inequality {
|
||||
Inequality {
|
||||
operator,
|
||||
ref left,
|
||||
ref right,
|
||||
} => write!(f, "{:?} {:?} {:?}", left, operator, right),
|
||||
|
||||
&Matches(ref qa, ref thing) => write!(f, "{:?} MATCHES {:?}", qa, thing),
|
||||
Matches(ref qa, ref thing) => write!(f, "{:?} MATCHES {:?}", qa, thing),
|
||||
|
||||
&HasTypes {
|
||||
HasTypes {
|
||||
ref value,
|
||||
ref value_types,
|
||||
check_value,
|
||||
|
@ -553,7 +553,7 @@ impl Debug for ColumnConstraint {
|
|||
write!(f, "(")?;
|
||||
for value_type in value_types.iter() {
|
||||
write!(f, "({:?}.value_type_tag = {:?}", value, value_type)?;
|
||||
if check_value && value_type == ValueType::Double
|
||||
if *check_value && value_type == ValueType::Double
|
||||
|| value_type == ValueType::Long
|
||||
{
|
||||
write!(
|
||||
|
@ -573,7 +573,7 @@ impl Debug for ColumnConstraint {
|
|||
}
|
||||
write!(f, "1)")
|
||||
}
|
||||
&NotExists(ref ct) => write!(f, "NOT EXISTS {:?}", ct),
|
||||
NotExists(ref ct) => write!(f, "NOT EXISTS {:?}", ct),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
@ -625,15 +625,15 @@ impl Debug for EmptyBecause {
|
|||
fn fmt(&self, f: &mut Formatter) -> ::std::fmt::Result {
|
||||
use self::EmptyBecause::*;
|
||||
match self {
|
||||
&CachedAttributeHasNoEntity {
|
||||
CachedAttributeHasNoEntity {
|
||||
ref value,
|
||||
ref attr,
|
||||
} => write!(f, "(?e, {}, {:?}, _) not present in store", attr, value),
|
||||
&CachedAttributeHasNoValues {
|
||||
CachedAttributeHasNoValues {
|
||||
ref entity,
|
||||
ref attr,
|
||||
} => write!(f, "({}, {}, ?v, _) not present in store", entity, attr),
|
||||
&ConflictingBindings {
|
||||
ConflictingBindings {
|
||||
ref var,
|
||||
ref existing,
|
||||
ref desired,
|
||||
|
@ -642,7 +642,7 @@ impl Debug for EmptyBecause {
|
|||
"Var {:?} can't be {:?} because it's already bound to {:?}",
|
||||
var, desired, existing
|
||||
),
|
||||
&TypeMismatch {
|
||||
TypeMismatch {
|
||||
ref var,
|
||||
ref existing,
|
||||
ref desired,
|
||||
|
@ -651,7 +651,7 @@ impl Debug for EmptyBecause {
|
|||
"Type mismatch: {:?} can't be {:?}, because it's already {:?}",
|
||||
var, desired, existing
|
||||
),
|
||||
&KnownTypeMismatch {
|
||||
KnownTypeMismatch {
|
||||
ref left,
|
||||
ref right,
|
||||
} => write!(
|
||||
|
@ -659,25 +659,25 @@ impl Debug for EmptyBecause {
|
|||
"Type mismatch: {:?} can't be compared to {:?}",
|
||||
left, right
|
||||
),
|
||||
&NoValidTypes(ref var) => write!(f, "Type mismatch: {:?} has no valid types", var),
|
||||
&NonAttributeArgument => write!(f, "Non-attribute argument in attribute place"),
|
||||
&NonInstantArgument => write!(f, "Non-instant argument in instant place"),
|
||||
&NonEntityArgument => write!(f, "Non-entity argument in entity place"),
|
||||
&NonNumericArgument => write!(f, "Non-numeric argument in numeric place"),
|
||||
&NonStringFulltextValue => write!(f, "Non-string argument for fulltext attribute"),
|
||||
&UnresolvedIdent(ref kw) => write!(f, "Couldn't resolve keyword {}", kw),
|
||||
&InvalidAttributeIdent(ref kw) => write!(f, "{} does not name an attribute", kw),
|
||||
&InvalidAttributeEntid(entid) => write!(f, "{} is not an attribute", entid),
|
||||
&NonFulltextAttribute(entid) => write!(f, "{} is not a fulltext attribute", entid),
|
||||
&InvalidBinding(ref column, ref tv) => {
|
||||
NoValidTypes(ref var) => write!(f, "Type mismatch: {:?} has no valid types", var),
|
||||
NonAttributeArgument => write!(f, "Non-attribute argument in attribute place"),
|
||||
NonInstantArgument => write!(f, "Non-instant argument in instant place"),
|
||||
NonEntityArgument => write!(f, "Non-entity argument in entity place"),
|
||||
NonNumericArgument => write!(f, "Non-numeric argument in numeric place"),
|
||||
NonStringFulltextValue => write!(f, "Non-string argument for fulltext attribute"),
|
||||
UnresolvedIdent(ref kw) => write!(f, "Couldn't resolve keyword {}", kw),
|
||||
InvalidAttributeIdent(ref kw) => write!(f, "{} does not name an attribute", kw),
|
||||
InvalidAttributeEntid(entid) => write!(f, "{} is not an attribute", entid),
|
||||
NonFulltextAttribute(entid) => write!(f, "{} is not a fulltext attribute", entid),
|
||||
InvalidBinding(ref column, ref tv) => {
|
||||
write!(f, "{:?} cannot name column {:?}", tv, column)
|
||||
}
|
||||
&ValueTypeMismatch(value_type, ref typed_value) => write!(
|
||||
ValueTypeMismatch(value_type, ref typed_value) => write!(
|
||||
f,
|
||||
"Type mismatch: {:?} doesn't match attribute type {:?}",
|
||||
typed_value, value_type
|
||||
),
|
||||
&AttributeLookupFailed => write!(f, "Attribute lookup failed"),
|
||||
AttributeLookupFailed => write!(f, "Attribute lookup failed"),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
|
|
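The hunks above are almost entirely one mechanical change: dropping the `&` sigil and `ref` bindings from match arms. Since Rust 2018's default binding modes (match ergonomics), matching a reference no longer requires `&`-prefixed patterns; bindings become references automatically. A minimal sketch, using a hypothetical two-variant enum rather than Mentat's actual `Column` type:

```rust
// A stand-in enum to contrast the two pattern styles in this commit.
enum Column {
    Fixed(String),
    Variable(String),
}

// Old style, pre-match-ergonomics: explicit `&` and `ref` in every arm.
fn name_old(c: &Column) -> String {
    match c {
        &Column::Fixed(ref s) => format!("fixed:{}", s),
        &Column::Variable(ref s) => format!("var:{}", s),
    }
}

// New style, as in this diff: no `&`; `s` still binds by reference
// because `c` is a reference (default binding modes).
fn name_new(c: &Column) -> String {
    match c {
        Column::Fixed(s) => format!("fixed:{}", s),
        Column::Variable(s) => format!("var:{}", s),
    }
}

fn main() {
    let c = Column::Variable("x".into());
    // Both styles are equivalent; clippy's match_ref_pats prefers the new one.
    assert_eq!(name_old(&c), name_new(&c));
    assert_eq!(name_new(&Column::Fixed("y".into())), "fixed:y");
}
```

Note the knock-on effect visible in the `HasTypes` hunk: with the `&` removed, `check_value` binds as `&bool`, so the use site becomes `*check_value`.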
@@ -158,8 +158,7 @@ pub struct SubvecIntoIterator<T> {
}

impl<T> Iterator for SubvecIntoIterator<T> {
    // TODO: this is a good opportunity to use `SmallVec` instead: most queries
    // return a handful of columns.
    // TODO: this is a good opportunity to use `SmallVec` instead: most queries return a handful of columns.
    type Item = Vec<T>;
    fn next(&mut self) -> Option<Self::Item> {
        let result: Vec<_> = (&mut self.values).take(self.width).collect();

@@ -73,6 +73,7 @@ impl QueryFragment for () {
}

/// A QueryBuilder that implements SQLite's specific escaping rules.
#[derive(Default)]
pub struct SQLiteQueryBuilder {
    pub sql: String,

@@ -107,7 +108,7 @@ impl SQLiteQueryBuilder {

    fn next_argument_name(&mut self) -> String {
        let arg = format!("{}{}", self.arg_prefix, self.arg_counter);
        self.arg_counter = self.arg_counter + 1;
        self.arg_counter += 1;
        arg
    }

@@ -138,10 +139,10 @@ impl QueryBuilder for SQLiteQueryBuilder {
    fn push_typed_value(&mut self, value: &TypedValue) -> BuildQueryResult {
        use TypedValue::*;
        match value {
            &Ref(entid) => self.push_sql(entid.to_string().as_str()),
            &Boolean(v) => self.push_sql(if v { "1" } else { "0" }),
            &Long(v) => self.push_sql(v.to_string().as_str()),
            &Double(OrderedFloat(v)) => {
            Ref(entid) => self.push_sql(entid.to_string().as_str()),
            Boolean(v) => self.push_sql(if *v { "1" } else { "0" }),
            Long(v) => self.push_sql(v.to_string().as_str()),
            Double(OrderedFloat(v)) => {
                // Rust's floats print without a trailing '.' in some cases.
                // https://github.com/rust-lang/rust/issues/30967
                // We format with 'e' -- scientific notation -- so that SQLite treats them as

@@ -149,10 +150,10 @@ impl QueryBuilder for SQLiteQueryBuilder {
                // will currently (2017-06) always be 0, and need to round-trip as doubles.
                self.push_sql(format!("{:e}", v).as_str());
            }
            &Instant(dt) => {
            Instant(dt) => {
                self.push_sql(format!("{}", dt.to_micros()).as_str()); // TODO: argument instead?
            }
            &Uuid(ref u) => {
            Uuid(ref u) => {
                let bytes = u.as_bytes();
                if let Some(arg) = self.byte_args.get(bytes.as_ref()).cloned() {
                    // Why, borrow checker, why?!

@@ -166,7 +167,7 @@ impl QueryBuilder for SQLiteQueryBuilder {
            // These are both `Rc`. Unfortunately, we can't use that fact when
            // turning these into rusqlite Values.
            // However, we can check to see whether there's an existing var that matches…
            &String(ref s) => {
            String(ref s) => {
                if let Some(arg) = self.string_args.get(s).cloned() {
                    self.push_named_arg(arg.as_str());
                } else {

@@ -175,7 +176,7 @@ impl QueryBuilder for SQLiteQueryBuilder {
                    self.string_args.insert(s.clone(), arg);
                }
            }
            &Keyword(ref s) => {
            Keyword(ref s) => {
                // TODO: intern.
                let v = Rc::new(rusqlite::types::Value::Text(s.as_ref().to_string()));
                self.push_static_arg(v);

@@ -233,7 +234,7 @@ impl QueryBuilder for SQLiteQueryBuilder {
        args.sort_by(|&(ref k1, _), &(ref k2, _)| k1.cmp(k2));
        SQLQuery {
            sql: self.sql,
            args: args,
            args,
        }
    }
}

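Besides the pattern cleanup, the SQL-builder hunks apply two more clippy fixes: `assign_op_pattern` (`self.arg_counter += 1` instead of `self.arg_counter = self.arg_counter + 1`) and `redundant_field_names` (`args,` instead of `args: args,`). A self-contained sketch with illustrative names, not the real `SQLiteQueryBuilder`:

```rust
// Miniature builder demonstrating the two mechanical fixes above.
struct SQLQuery {
    sql: String,
    args: Vec<(String, String)>,
}

struct Builder {
    sql: String,
    arg_prefix: String,
    arg_counter: i64,
}

impl Builder {
    fn next_argument_name(&mut self) -> String {
        let arg = format!("{}{}", self.arg_prefix, self.arg_counter);
        self.arg_counter += 1; // clippy::assign_op_pattern: prefer `+=`
        arg
    }

    fn finish(self, args: Vec<(String, String)>) -> SQLQuery {
        let sql = self.sql;
        // clippy::redundant_field_names: local and field share a name,
        // so `sql: sql, args: args` collapses to the shorthand below.
        SQLQuery { sql, args }
    }
}

fn main() {
    let mut b = Builder {
        sql: "SELECT 1".into(),
        arg_prefix: "$v".into(),
        arg_counter: 0,
    };
    assert_eq!(b.next_argument_name(), "$v0");
    assert_eq!(b.next_argument_name(), "$v1");
    let q = b.finish(vec![]);
    assert_eq!(q.sql, "SELECT 1");
    assert!(q.args.is_empty());
}
```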
@@ -240,7 +240,7 @@ impl Conn {
        let tx = sqlite.transaction_with_behavior(behavior)?;
        let (current_generation, current_partition_map, current_schema, cache_cow) = {
            // The mutex is taken during this block.
            let ref current: Metadata = *self.metadata.lock().unwrap();
            let current: &Metadata = &(*self.metadata.lock().unwrap());
            (
                current.generation,
                // Expensive, but the partition map is updated after every committed transaction.

@@ -367,7 +367,7 @@ impl Conn {
            .register(key, observer);
    }

    pub fn unregister_observer(&mut self, key: &String) {
    pub fn unregister_observer(&mut self, key: &str) {
        self.tx_observer_service.lock().unwrap().deregister(key);
    }
}

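The `unregister_observer` hunk is clippy's `ptr_arg` lint: a `&String` parameter forces callers to hold an owned `String`, while `&str` accepts both owned strings and literals via deref coercion, with no loss of capability inside the function. A hypothetical sketch (the service shape is illustrative, not Mentat's `TxObservationService`):

```rust
use std::collections::HashMap;

struct ObserverService {
    observers: HashMap<String, u32>,
}

impl ObserverService {
    // Taking `&str` instead of `&String`: HashMap<String, _>::remove
    // accepts any `&Q` where `String: Borrow<Q>`, so `&str` works directly.
    fn deregister(&mut self, key: &str) {
        self.observers.remove(key);
    }
}

fn main() {
    let mut svc = ObserverService {
        observers: HashMap::new(),
    };
    svc.observers.insert("cache".to_string(), 1);
    // With `&str`, a bare literal is accepted; with `&String` the caller
    // would have had to allocate: svc.deregister(&"cache".to_string()).
    svc.deregister("cache");
    assert!(svc.observers.is_empty());
}
```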
@@ -52,7 +52,7 @@ impl Store {
        let mut connection = crate::new_connection(path)?;
        let conn = Conn::connect(&mut connection)?;
        Ok(Store {
            conn: conn,
            conn,
            sqlite: connection,
        })
    }

@@ -100,7 +100,7 @@ impl Store {
        let mut connection = crate::new_connection_with_key(path, encryption_key)?;
        let conn = Conn::connect(&mut connection)?;
        Ok(Store {
            conn: conn,
            conn,
            sqlite: connection,
        })
    }

@@ -234,7 +234,7 @@ impl Definition {
    {
        Definition {
            name: name.into(),
            version: version,
            version,
            attributes: attributes.into(),
            pre: Definition::no_op,
            post: Definition::no_op,

@@ -730,8 +730,8 @@ impl<'a, 'c> VersionedStore for InProgress<'a, 'c> {
            }

            c @ VocabularyCheck::NotPresent
            | c @ VocabularyCheck::PresentButNeedsUpdate { older_version: _ }
            | c @ VocabularyCheck::PresentButMissingAttributes { attributes: _ } => {
            | c @ VocabularyCheck::PresentButNeedsUpdate { .. }
            | c @ VocabularyCheck::PresentButMissingAttributes { .. } => {
                work.add(definition, c);
            }
        }

@@ -758,8 +758,7 @@ impl<'a, 'c> VersionedStore for InProgress<'a, 'c> {
                    // Save this: we'll do it later.
                    missing.push((definition, attributes));
                }
                VocabularyCheck::Present
                | VocabularyCheck::PresentButTooNew { newer_version: _ } => {
                VocabularyCheck::Present | VocabularyCheck::PresentButTooNew { .. } => {
                    unreachable!();
                }
            }

@@ -815,9 +814,9 @@ impl SimpleVocabularySource {
        post: Option<fn(&mut InProgress<'_, '_>) -> Result<()>>,
    ) -> SimpleVocabularySource {
        SimpleVocabularySource {
            pre: pre,
            post: post,
            definitions: definitions,
            pre,
            post,
            definitions,
        }
    }

@@ -910,8 +909,8 @@ where
            .collect();
        Ok(Some(Vocabulary {
            entity: entid.into(),
            version: version,
            attributes: attributes,
            version,
            attributes,
        }))
    }
    Some(_) => bail!(MentatError::InvalidVocabularyVersion),

@@ -982,7 +981,7 @@ where
            name.clone(),
            Vocabulary {
                entity: vocab,
                version: version,
                version,
                attributes: attrs,
            },
        )

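The vocabulary hunks replace explicit ignored fields like `PresentButNeedsUpdate { older_version: _ }` with the `..` rest pattern, `PresentButNeedsUpdate { .. }`, which ignores all remaining fields and keeps the match compiling if fields are later added. A sketch using a simplified stand-in for Mentat's `VocabularyCheck`:

```rust
// Simplified enum; the real VocabularyCheck carries richer payloads.
enum VocabularyCheck {
    Present,
    NotPresent,
    PresentButNeedsUpdate { older_version: u32 },
    PresentButTooNew { newer_version: u32 },
}

// `{ .. }` ignores every field of the struct-like variant, so this match
// doesn't need updating if a variant gains fields.
fn needs_work(c: &VocabularyCheck) -> bool {
    match c {
        VocabularyCheck::NotPresent
        | VocabularyCheck::PresentButNeedsUpdate { .. } => true,
        VocabularyCheck::Present
        | VocabularyCheck::PresentButTooNew { .. } => false,
    }
}

fn main() {
    assert!(needs_work(&VocabularyCheck::NotPresent));
    assert!(needs_work(&VocabularyCheck::PresentButNeedsUpdate { older_version: 1 }));
    assert!(!needs_work(&VocabularyCheck::PresentButTooNew { newer_version: 9 }));
}
```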