The Meta Executor

This Jupyter notebook runs the other notebooks in this repository. When it runs them, it does NOT overwrite their code; it only updates their outputs. Those notebooks each save databases, tables, or figures, so you could, hypothetically, run every analysis without ever opening the notebooks themselves, then browse the results in the all_figures.ipynb or all_tables.ipynb notebooks.

This notebook is also where I have centralized high-level documentation. To edit documentation, see the documentation.yaml file.

todos

  • fix the term consolidation. I want the top 30000 "terms" -- cooccurrences between words
    • in terms of frequency
    • in terms of chi2 non-independence with central citations
    • in terms of chi2 with each other (use cortext)
  • export demographic tables
  • journal trends figure
  • export figure with different kinds of deaths, consolidate multiple figures into "bigdeaths"
In [2]:
import sys; sys.path.append(_dh[0].split("knowknow")[0])
from knowknow import *
In [3]:
import papermill as pm
In [4]:
if True:  # enable INFO-level logging so papermill progress is visible
    import logging
    logging.basicConfig()
    logging.getLogger().setLevel(logging.INFO)
In [4]:
showdocs("counter")

Counting cooccurrences

Cultural phenomena are rich in meaning and context. Moreover, the meaning and context are what we care about, so stripping them away would be a disservice. Consider Geertz:

Not only is the semantic structure of the figure a good deal more complex than it appears on the surface, but an analysis of that structure forces one into tracing a multiplicity of referential connections between it and social reality, so that the final picture is one of a configuration of dissimilar meanings out of whose interworking both the expressive power and the rhetorical force of the final symbol derive. (Geertz [1955] 1973, Chapter 8 Ideology as a Cultural System, p. 213)

The ways people understand their world shape their action, and understandings are heterogeneous in any community, woven into a complex web of interacting pieces and parts. Understandings are constantly evolving, shifting with every conversation or piece of breaking news. Any quantitative technique for studying meaning must capture both the relational structure of cultural objects and their temporal dynamics, or it cannot capture meaning at all.

These considerations motivate how I have designed the data structures and code for this project. My attention to "cooccurrences" in what follows is an application of Levi Martin and Lee's (2018) formal approach to meaning. They develop the symbolic formalism I use below and demonstrate several general analytic strategies for inductive, ground-up meaning-making from count data. The approach is quite general and useful for many applications.

The process is rather simple: I count cooccurrences between various attributes. For each document, and for each citation in that document, I increment a dozen counters, depending on attributes of the citation, paper, journal, or author. This counting is done once, and the result serves as a compressed form of the dataset for all further analyses. In the terminology of Levi Martin and Lee, I am constructing "hypergraphs", and I will use their notation in what follows. For example, $[c*fy]$ denotes the dataset mapping $(c, fy) \to count$, where $c$ is the name of the cited work, $fy$ is the publication year of the article making the citation, and $count$ is the number of citations at the intersection of these properties.

  • $[c]$ the number of citations each document receives
  • $[c*fj]$ the number of citations each document receives from each journal's articles
  • $[c*fy]$ the number of citations each document receives from each year's articles
  • $[fj]$ the number of citations from each journal
  • $[fj*fy]$ the number of citations in each journal in each year
  • $[t]$ cited term total counts
  • $[fy*t]$ cited term time series
  • term cooccurrence with citation and journal ($[c*t]$ and $[fj*t]$)
  • "author" counts, the number of citations by each author ($[a]$ $[a*c]$ $[a*j*y]$)
  • $[c*c]$, the cooccurrence network between citations
  • the death of citations can be studied using the $[c*fy]$ hypergraph
  • $[c*fj*t]$ could be used for analyzing differential associations of $c$ to $t$ across publication venues
  • $[ta*ta]$, $[fa*fa]$, $[t*t]$ and $[c*c]$ open the door to network-scientific methods
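As a minimal sketch of this counting pass, the hypergraph construction amounts to incrementing tuple-keyed counters once per citation. The field names (`fy`, `fj`, `citations`) are hypothetical placeholders for illustration, not the repository's actual schema:

```python
from collections import Counter

def count_cooccurrences(documents):
    """Build the [c], [c*fy], and [c*fj] hypergraphs from a document stream.

    Each document is assumed to be a dict with 'fy' (publication year),
    'fj' (journal), and 'citations' (a list of cited-work names) --
    hypothetical field names, not the repository's actual schema.
    """
    counts = {"c": Counter(), "c.fy": Counter(), "c.fj": Counter()}
    for doc in documents:
        for c in doc["citations"]:
            counts["c"][(c,)] += 1               # [c]
            counts["c.fy"][(c, doc["fy"])] += 1  # [c*fy]
            counts["c.fj"][(c, doc["fj"])] += 1  # [c*fj]
    return counts

docs = [
    {"fy": 1990, "fj": "AJS", "citations": ["geertz 1973", "martin 2018"]},
    {"fy": 1991, "fj": "ASR", "citations": ["geertz 1973"]},
]
counts = count_cooccurrences(docs)  # counts["c"][("geertz 1973",)] == 2
```

Because the keys are plain tuples, any hypergraph in the list above is just another Counter with a longer key, and the whole set can be pickled as the compressed form of the dataset.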

References

  • Martin, John Levi, and Monica Lee. 2018. “A Formal Approach to Meaning.” Poetics 68(February):10–17.
  • Geertz, Clifford. 1973. The Interpretation of Cultures. New York: Basic Books, Inc.
In [ ]:
# JSTOR counter

if False:  # flip to True to re-run the JSTOR counting notebook
    jstor = Path(_dh[0]).joinpath('creating variables','jstor counter (cnt).ipynb')
    pm.execute_notebook(
        str(jstor),
        str(jstor),
        parameters = {},
        nest_asyncio=True
    )
In [ ]:
# WOS counter

if False:  # flip to True to re-run the Web of Science counting notebook
    wos = Path(_dh[0]).joinpath('creating variables','web of science counter (cnt).ipynb')
    pm.execute_notebook(
        str(wos),
        str(wos),
        parameters = {},
        nest_asyncio=True
    )
In [8]:
showdocs("top1")

Zooming in on the top 1%

I would like to look at the most successful cited authors, cited works, and cited terms. Unfortunately, this isn't so simple: the supply of citations has increased dramatically over the last 100 years, so the set with the most total citations would be skewed toward the citation preferences of recent papers. To account for this bias, I select among items cited by articles published in each sliding decade: 1940-1950, 1941-1951, 1942-1952, and so on through 1980-1990. Within each decade I determine which items were the top-cited 1%. The union of these top 1%s, across all decade spans, comprises the 1% I study in this paper.
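The sliding-decade selection can be sketched as follows. Here `citations_by_year` is a hypothetical pre-aggregation of the $[c*fy]$ hypergraph, and the inclusive window arithmetic is an assumption about how the top1 notebook defines its decade spans:

```python
from collections import Counter

def top_percent_union(citations_by_year, start=1940, end=1980, window=10, pct=0.01):
    """Union of the top `pct` most-cited items across sliding decade windows.

    citations_by_year: {year: Counter(cited_work -> citation count)},
    a hypothetical pre-aggregation of the [c*fy] hypergraph.
    Windows run start..start+window, start+1..start+1+window, ..., inclusive.
    """
    elite = set()
    for y0 in range(start, end + 1):
        window_counts = Counter()
        for y in range(y0, y0 + window + 1):  # e.g. 1940-1950 inclusive
            window_counts.update(citations_by_year.get(y, Counter()))
        if not window_counts:
            continue
        k = max(1, int(len(window_counts) * pct))  # top 1%, at least one item
        elite.update(item for item, _ in window_counts.most_common(k))
    return elite
```

Items that were elite in any window stay in the set, so early classics are not crowded out by the larger citation volumes of later decades.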

In [28]:
# top1
ys = Path(_dh[0]).joinpath('creating variables','top percent cited in decade (top1).ipynb')

settings = [
    {"database_name":"sociology-wos","ctype":'c'},
    {"database_name":"sociology-wos","ctype":'ta'},
    {"database_name":"sociology-wos","ctype":'fa'},
    #{"database_name":"sociology-jstor-basicall","ctype":'t', }
]

for sett in settings:
    pm.execute_notebook(
        str(ys),
        str(ys),
        parameters = sett,
        nest_asyncio=True
    )
INFO:papermill:Input Notebook:  G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\creating variables\top percent cited in decade (top1).ipynb
INFO:papermill:Output Notebook: G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\creating variables\top percent cited in decade (top1).ipynb
INFO:papermill:Executing notebook with kernel: python3
INFO:papermill:Input Notebook:  G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\creating variables\top percent cited in decade (top1).ipynb
INFO:papermill:Output Notebook: G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\creating variables\top percent cited in decade (top1).ipynb

INFO:papermill:Executing notebook with kernel: python3
INFO:papermill:Input Notebook:  G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\creating variables\top percent cited in decade (top1).ipynb
INFO:papermill:Output Notebook: G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\creating variables\top percent cited in decade (top1).ipynb

INFO:papermill:Executing notebook with kernel: python3

In [31]:
# ysum

ys = next(Path(_dh[0]).glob('creating variables/ysum*.ipynb'))

settings = [
    #{"database_name":"sociology-wos","dtype":'c'},
    #{"database_name":"sociology-wos","dtype":'ta'},
    #{"database_name":"sociology-wos","dtype":'fa'},
    {"database_name":"sociology-wos","dtype":'ffa'},
    #{"database_name":"sociology-jstor-basicall","dtype":'t'}
]

for sett in settings:
    pm.execute_notebook(
        str(ys),
        str(ys),
        parameters = sett,
        log_output=True,
        progress_bar=False,
        nest_asyncio=True
    )
INFO:papermill:Input Notebook:  G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\creating variables\ysum - computing temporal summaries.ipynb
INFO:papermill:Output Notebook: G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\creating variables\ysum - computing temporal summaries.ipynb
INFO:papermill:Executing notebook with kernel: python3
INFO:papermill:Executing Cell 1---------------------------------------
INFO:papermill:Ending Cell 1------------------------------------------
INFO:papermill:Executing Cell 2---------------------------------------
INFO:papermill:Ending Cell 2------------------------------------------
INFO:papermill:Executing Cell 3---------------------------------------
INFO:papermill:Ending Cell 3------------------------------------------
INFO:papermill:Executing Cell 4---------------------------------------
INFO:papermill:Ending Cell 4------------------------------------------
INFO:papermill:Executing Cell 5---------------------------------------
INFO:papermill:Ending Cell 5------------------------------------------
INFO:papermill:Executing Cell 6---------------------------------------
INFO:papermill:Pubyears loaded for 111731 entries

INFO:papermill:Ending Cell 6------------------------------------------
INFO:papermill:Executing Cell 7---------------------------------------
INFO:papermill:Ending Cell 7------------------------------------------
INFO:papermill:Executing Cell 8---------------------------------------
INFO:papermill:Ending Cell 8------------------------------------------
INFO:papermill:Executing Cell 9---------------------------------------
INFO:papermill:Ending Cell 9------------------------------------------
INFO:papermill:Executing Cell 10--------------------------------------
INFO:papermill:Ending Cell 10-----------------------------------------
INFO:papermill:Executing Cell 11--------------------------------------
INFO:papermill:Loaded keys: dict_keys(['ffa.fy', 'fy', 'ffa'])
Available keys: ['a', 'c', 'c.c', 'c.fj', 'c.fy', 'c.fy.j', 'fa', 'fa.c', 'fa.fj', 'fa.fj.fy', 'fa.fy', 'ffa', 'ffa.c', 'ffa.fj', 'ffa.fy', 'fj', 'fj.fy', 'fj.ta', 'fj.ty', 'fy', 'fy.ta', 'fy.ty', 'ta', 'ty', 'ty.ty']

INFO:papermill:Ending Cell 11-----------------------------------------
INFO:papermill:Executing Cell 12--------------------------------------
INFO:papermill:Ending Cell 12-----------------------------------------
INFO:papermill:Executing Cell 13--------------------------------------
INFO:papermill:Ending Cell 13-----------------------------------------
INFO:papermill:Executing Cell 14--------------------------------------
INFO:papermill:Ending Cell 14-----------------------------------------
INFO:papermill:Executing Cell 15--------------------------------------
INFO:papermill:Ending Cell 15-----------------------------------------
INFO:papermill:Executing Cell 16--------------------------------------
INFO:papermill:Ending Cell 16-----------------------------------------
INFO:papermill:Executing Cell 17--------------------------------------
INFO:papermill:Ending Cell 17-----------------------------------------
INFO:papermill:Executing Cell 18--------------------------------------
INFO:papermill:Ending Cell 18-----------------------------------------
INFO:papermill:Executing Cell 19--------------------------------------
INFO:papermill:Ending Cell 19-----------------------------------------
INFO:papermill:Executing Cell 20--------------------------------------
INFO:papermill:Ending Cell 20-----------------------------------------
INFO:papermill:Executing Cell 21--------------------------------------
INFO:papermill:Ending Cell 21-----------------------------------------
INFO:papermill:Executing Cell 22--------------------------------------
INFO:papermill:Ending Cell 22-----------------------------------------
INFO:papermill:Executing Cell 23--------------------------------------
INFO:papermill:Ending Cell 23-----------------------------------------
INFO:papermill:Executing Cell 24--------------------------------------
INFO:papermill:Ending Cell 24-----------------------------------------
INFO:papermill:Executing Cell 25--------------------------------------
INFO:papermill:Ending Cell 25-----------------------------------------
INFO:papermill:Executing Cell 26--------------------------------------
INFO:papermill:Processing database 'sociology-wos'
Item 0
defaultdict(<class 'int'>, {})

INFO:papermill:Item 5000

INFO:papermill:Item 10000
Item 15000

INFO:papermill:Item 20000

INFO:papermill:Item 25000

INFO:papermill:Item 30000

INFO:papermill:Item 35000

INFO:papermill:Ending Cell 26-----------------------------------------
INFO:papermill:Executing Cell 27--------------------------------------
INFO:papermill:Ending Cell 27-----------------------------------------
INFO:papermill:Executing Cell 28--------------------------------------
INFO:papermill:death_0 None
death_1 None
death_2 None
death_3 None
death_5 None
first 2011
last 2017
maxcount 2
maxcounty 2015
maxprop 0.0006754474839581223
maxpropy 2015
name hooghe, m
rebirth_0_10 None
rebirth_0_20 None
rebirth_0_3 None
rebirth_0_5 None
rebirth_1_10 None
rebirth_1_20 None
rebirth_1_3 None
rebirth_1_5 None
rebirth_2_10 None
rebirth_2_20 None
rebirth_2_3 None
rebirth_2_5 None
rebirth_3_10 None
rebirth_3_20 None
rebirth_3_3 None
rebirth_3_5 None
rebirth_5_10 None
rebirth_5_20 None
rebirth_5_3 None
rebirth_5_5 None
total 6
totalprop 0.0020860875658478503
----------------------------
death_0 None
death_1 None
death_2 None
death_3 None
death_5 None
first 2011
last 2018
maxcount 2
maxcounty 2013
maxprop 0.0006963788300835655
maxpropy 2013
name gondal, n
rebirth_0_10 None
rebirth_0_20 None
rebirth_0_3 None
rebirth_0_5 None
rebirth_1_10 None
rebirth_1_20 None
rebirth_1_3 None
rebirth_1_5 None
rebirth_2_10 None
rebirth_2_20 None
rebirth_2_3 None
rebirth_2_5 None
rebirth_3_10 None
rebirth_3_20 None
rebirth_3_3 None
rebirth_3_5 None
rebirth_5_10 None
rebirth_5_20 None
rebirth_5_3 None
rebirth_5_5 None
total 6
totalprop 0.00210948546559322
----------------------------
death_0 None
death_1 None
death_2 None
death_3 None
death_5 None
first 1995
last 2016
maxcount 3
maxcounty 2013
maxprop 0.0010445682451253482
maxpropy 2013
name franzen, a
rebirth_0_10 None
rebirth_0_20 None
rebirth_0_3 None
rebirth_0_5 None
rebirth_1_10 None
rebirth_1_20 None
rebirth_1_3 None
rebirth_1_5 None
rebirth_2_10 None
rebirth_2_20 None
rebirth_2_3 None
rebirth_2_5 None
rebirth_3_10 None
rebirth_3_20 None
rebirth_3_3 None
rebirth_3_5 None
rebirth_5_10 None
rebirth_5_20 None
rebirth_5_3 None
rebirth_5_5 None
total 11
totalprop 0.00500903153824101
----------------------------
death_0 None
death_1 None
death_2 None
death_3 None
death_5 1995
first 1978
last 2015
maxcount 2
maxcounty 1990
maxprop 0.0015048908954100827
maxpropy 1990
name friedkin, n
rebirth_0_10 None
rebirth_0_20 None
rebirth_0_3 None
rebirth_0_5 None
rebirth_1_10 None
rebirth_1_20 None
rebirth_1_3 None
rebirth_1_5 None
rebirth_2_10 None
rebirth_2_20 None
rebirth_2_3 None
rebirth_2_5 None
rebirth_3_10 None
rebirth_3_20 None
rebirth_3_3 None
rebirth_3_5 None
rebirth_5_10 None
rebirth_5_20 None
rebirth_5_3 2009
rebirth_5_5 2009
total 20
totalprop 0.014073701394404638
----------------------------
death_0 2000
death_1 2000
death_2 1994
death_3 1994
death_5 1990
first 1982
last 2019
maxcount 2
maxcounty 1989
maxprop 0.001444043321299639
maxpropy 1989
name wallace, m
rebirth_0_10 None
rebirth_0_20 None
rebirth_0_3 None
rebirth_0_5 None
rebirth_1_10 None
rebirth_1_20 None
rebirth_1_3 None
rebirth_1_5 None
rebirth_2_10 None
rebirth_2_20 None
rebirth_2_3 2011
rebirth_2_5 None
rebirth_3_10 None
rebirth_3_20 None
rebirth_3_3 2011
rebirth_3_5 None
rebirth_5_10 None
rebirth_5_20 None
rebirth_5_3 2011
rebirth_5_5 None
total 10
totalprop 0.006331034301149077
----------------------------
death_0 None
death_1 None
death_2 None
death_3 None
death_5 None
first 2008
last 2019
maxcount 2
maxcounty 2014
maxprop 0.0006724949562878278
maxpropy 2014
name pais, j
rebirth_0_10 None
rebirth_0_20 None
rebirth_0_3 None
rebirth_0_5 None
rebirth_1_10 None
rebirth_1_20 None
rebirth_1_3 None
rebirth_1_5 None
rebirth_2_10 None
rebirth_2_20 None
rebirth_2_3 None
rebirth_2_5 None
rebirth_3_10 None
rebirth_3_20 None
rebirth_3_3 None
rebirth_3_5 None
rebirth_5_10 None
rebirth_5_20 None
rebirth_5_3 None
rebirth_5_5 None
total 7
totalprop 0.0025622212558883807
----------------------------
death_0 None
death_1 None
death_2 None
death_3 None
death_5 None
first 2000
last 2012
maxcount 2
maxcounty 2011
maxprop 0.0012062726176115801
maxpropy 2001
name crowder, k
rebirth_0_10 None
rebirth_0_20 None
rebirth_0_3 None
rebirth_0_5 None
rebirth_1_10 None
rebirth_1_20 None
rebirth_1_3 None
rebirth_1_5 None
rebirth_2_10 None
rebirth_2_20 None
rebirth_2_3 None
rebirth_2_5 None
rebirth_3_10 None
rebirth_3_20 None
rebirth_3_3 None
rebirth_3_5 None
rebirth_5_10 None
rebirth_5_20 None
rebirth_5_3 None
rebirth_5_5 None
total 11
totalprop 0.005646195842509818
----------------------------
death_0 None
death_1 None
death_2 None
death_3 None
death_5 None
first 2001
last 2015
maxcount 3
maxcounty 2011
maxprop 0.0017657445556209534
maxpropy 2007
name alon, s
rebirth_0_10 None
rebirth_0_20 None
rebirth_0_3 None
rebirth_0_5 None
rebirth_1_10 None
rebirth_1_20 None
rebirth_1_3 None
rebirth_1_5 None
rebirth_2_10 None
rebirth_2_20 None
rebirth_2_3 None
rebirth_2_5 None
rebirth_3_10 None
rebirth_3_20 None
rebirth_3_3 None
rebirth_3_5 None
rebirth_5_10 None
rebirth_5_20 None
rebirth_5_3 None
rebirth_5_5 None
total 12
totalprop 0.0059347316201944856
----------------------------
death_0 None
death_1 None
death_2 None
death_3 None
death_5 None
first 1998
last 2016
maxcount 2
maxcounty 2005
maxprop 0.0011813349084465446
maxpropy 2005
name carbonaro, w
rebirth_0_10 None
rebirth_0_20 None
rebirth_0_3 None
rebirth_0_5 None
rebirth_1_10 None
rebirth_1_20 None
rebirth_1_3 None
rebirth_1_5 None
rebirth_2_10 None
rebirth_2_20 None
rebirth_2_3 None
rebirth_2_5 None
rebirth_3_10 None
rebirth_3_20 None
rebirth_3_3 None
rebirth_3_5 None
rebirth_5_10 None
rebirth_5_20 None
rebirth_5_3 None
rebirth_5_5 None
total 8
totalprop 0.0039172848209811165
----------------------------
death_0 None
death_1 None
death_2 None
death_3 None
death_5 None
first 1990
last 2018
maxcount 2
maxcounty 2018
maxprop 0.0014705882352941176
maxpropy 1991
name sherkat, d
rebirth_0_10 None
rebirth_0_20 None
rebirth_0_3 None
rebirth_0_5 None
rebirth_1_10 None
rebirth_1_20 None
rebirth_1_3 None
rebirth_1_5 None
rebirth_2_10 None
rebirth_2_20 None
rebirth_2_3 None
rebirth_2_5 None
rebirth_3_10 None
rebirth_3_20 None
rebirth_3_3 None
rebirth_3_5 None
rebirth_5_10 None
rebirth_5_20 None
rebirth_5_3 None
rebirth_5_5 None
total 21
totalprop 0.012290534637626072
----------------------------

INFO:papermill:Ending Cell 28-----------------------------------------
INFO:papermill:Executing Cell 29--------------------------------------
INFO:papermill:Ending Cell 29-----------------------------------------
INFO:papermill:Executing Cell 30--------------------------------------
INFO:papermill:['death_0',
 'death_1',
 'death_2',
 'death_3',
 'death_5',
 'first',
 'last',
 'maxcount',
 'maxcounty',
 'maxprop',
 'maxpropy',
 'name',
 'rebirth_0_10',
 'rebirth_0_20',
 'rebirth_0_3',
 'rebirth_0_5',
 'rebirth_1_10',
 'rebirth_1_20',
 'rebirth_1_3',
 'rebirth_1_5',
 'rebirth_2_10',
 'rebirth_2_20',
 'rebirth_2_3',
 'rebirth_2_5',
 'rebirth_3_10',
 'rebirth_3_20',
 'rebirth_3_3',
 'rebirth_3_5',
 'rebirth_5_0',
 'rebirth_5_1',
 'rebirth_5_10',
 'rebirth_5_2',
 'rebirth_5_20',
 'rebirth_5_3',
 'rebirth_5_4',
 'rebirth_5_5',
 'rebirth_5_6',
 'rebirth_5_7',
 'rebirth_5_8',
 'rebirth_5_9',
 'total',
 'totalprop']
INFO:papermill:Ending Cell 30-----------------------------------------
INFO:papermill:Executing Cell 31--------------------------------------
INFO:papermill:159
INFO:papermill:Ending Cell 31-----------------------------------------
INFO:papermill:Executing Cell 32--------------------------------------
INFO:papermill:defaultdict(<class 'int'>, {'at least one citation': 39031, 'literally 1 citation. dropped.': 24226, 'passed tests pre-blacklist': 2911, 'never rise': 11894})

INFO:papermill:Ending Cell 32-----------------------------------------
INFO:papermill:Executing Cell 33--------------------------------------
INFO:papermill:Ending Cell 33-----------------------------------------
INFO:papermill:Executing Cell 34--------------------------------------
INFO:papermill:Ending Cell 34-----------------------------------------
INFO:papermill:Executing Cell 35--------------------------------------
INFO:papermill:Ending Cell 35-----------------------------------------
INFO:papermill:Executing Cell 36--------------------------------------
INFO:papermill:Ending Cell 36-----------------------------------------
INFO:papermill:Executing Cell 37--------------------------------------
INFO:papermill:Ending Cell 37-----------------------------------------

small analyses

In [5]:
# journal summaries
jsum = Path(_dh[0]).joinpath('analyses','summary of journals.ipynb')

pm.execute_notebook(
    str(jsum),
    str(jsum),
    parameters = {},
    nest_asyncio=True
);
INFO:papermill:Input Notebook:  G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\summary of journals.ipynb
INFO:papermill:Output Notebook: G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\summary of journals.ipynb
INFO:papermill:Executing notebook with kernel: python3

In [20]:
# momentary success
moment = Path(_dh[0]).joinpath('analyses','momentary success makes death less likely.ipynb')

settings = [
    {"database_name":"sociology-wos","dtype":'c'},
    {"database_name":"sociology-wos","dtype":'ta'},
    {"database_name":"sociology-wos","dtype":'fa'},
    #{"database_name":"sociology-jstor","dtype":'t'}
]

for sett in settings:
    pm.execute_notebook(
        str(moment),
        str(moment),
        parameters = sett,
        nest_asyncio=True
    )
INFO:papermill:Input Notebook:  G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\momentary success makes death less likely.ipynb
INFO:papermill:Output Notebook: G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\momentary success makes death less likely.ipynb
INFO:papermill:Executing notebook with kernel: python3
INFO:papermill:Input Notebook:  G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\momentary success makes death less likely.ipynb
INFO:papermill:Output Notebook: G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\momentary success makes death less likely.ipynb

INFO:papermill:Executing notebook with kernel: python3
INFO:papermill:Input Notebook:  G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\momentary success makes death less likely.ipynb
INFO:papermill:Output Notebook: G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\momentary success makes death less likely.ipynb

INFO:papermill:Executing notebook with kernel: python3

In [7]:
# ubiquitous power-law behavior
ys = Path(_dh[0]).joinpath('analyses','ubiquitous power-law behavior.ipynb')

settings = [
    {"database_name":"sociology-wos","dtype":'c'},
    {"database_name":"sociology-wos","dtype":'ta'},
    {"database_name":"sociology-jstor","dtype":'t'}
]

for sett in settings:
    pm.execute_notebook(
        str(ys),
        str(ys),
        parameters = sett,
        nest_asyncio=True
    )
INFO:papermill:Input Notebook:  G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\ubiquitous power-law behavior.ipynb
INFO:papermill:Output Notebook: G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\ubiquitous power-law behavior.ipynb
INFO:papermill:Executing notebook with kernel: python3
INFO:papermill:Input Notebook:  G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\ubiquitous power-law behavior.ipynb
INFO:papermill:Output Notebook: G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\ubiquitous power-law behavior.ipynb

INFO:papermill:Executing notebook with kernel: python3
INFO:papermill:Input Notebook:  G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\ubiquitous power-law behavior.ipynb
INFO:papermill:Output Notebook: G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\ubiquitous power-law behavior.ipynb

INFO:papermill:Executing notebook with kernel: python3

Demographics

In [5]:
# demographics of authors, context, etc

demo = Path(_dh[0]).joinpath('analyses','demographics.ipynb')

for i in [0, 1, 3, 4]:  # setting_no values to run
    pm.execute_notebook(
        str(demo),
        str(demo),
        parameters = dict(setting_no=i),
        nest_asyncio=True
    )
INFO:papermill:Input Notebook:  G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\demographics.ipynb
INFO:papermill:Output Notebook: G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\demographics.ipynb
INFO:blib2to3.pgen2.driver:Generating grammar tables from c:\users\amcga\envs\citation-deaths\lib\site-packages\blib2to3\Grammar.txt
INFO:blib2to3.pgen2.driver:Writing grammar tables to C:\Users\amcga\AppData\Local\black\black\Cache\19.10b0\Grammar3.7.5.final.0.pickle
INFO:blib2to3.pgen2.driver:Writing failed: [Errno 2] No such file or directory: 'C:\\Users\\amcga\\AppData\\Local\\black\\black\\Cache\\19.10b0\\tmpf7bbsp_j'
INFO:blib2to3.pgen2.driver:Generating grammar tables from c:\users\amcga\envs\citation-deaths\lib\site-packages\blib2to3\PatternGrammar.txt
INFO:blib2to3.pgen2.driver:Writing grammar tables to C:\Users\amcga\AppData\Local\black\black\Cache\19.10b0\PatternGrammar3.7.5.final.0.pickle
INFO:blib2to3.pgen2.driver:Writing failed: [Errno 2] No such file or directory: 'C:\\Users\\amcga\\AppData\\Local\\black\\black\\Cache\\19.10b0\\tmpvpulmihs'
INFO:papermill:Executing notebook with kernel: python3
INFO:papermill:Input Notebook:  G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\demographics.ipynb
INFO:papermill:Output Notebook: G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\demographics.ipynb

INFO:papermill:Executing notebook with kernel: python3
INFO:papermill:Input Notebook:  G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\demographics.ipynb
INFO:papermill:Output Notebook: G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\demographics.ipynb

INFO:papermill:Executing notebook with kernel: python3
INFO:papermill:Input Notebook:  G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\demographics.ipynb
INFO:papermill:Output Notebook: G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\demographics.ipynb

INFO:papermill:Executing notebook with kernel: python3

In [9]:
showdocs("100bigc")

Citations which were in the 1%, but have died

This figure shows a random sample of 100 such cited works: works that were once in the top 1% but have since died, in the sense of death_2, death_3, or death_5. These deaths are labeled for reference.
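As a hedged illustration of what a death_k measure might compute (a hypothetical reconstruction based on the ysum output above, not the repository's actual code, and covering only k ≥ 1): an item "dies" at the first cited year that is followed by at least k consecutive uncited years.

```python
def find_death(years_cited, gap, last_year):
    """First cited year followed by `gap`+ consecutive uncited years.

    Hypothetical reconstruction of the death_k measure (gap >= 1).
    `years_cited` lists the years the item received citations;
    `last_year` is the end of the observation window, so deaths
    too close to the boundary are not declared.
    """
    cited = set(years_cited)
    for y in sorted(cited):
        quiet = all((y + d) not in cited for d in range(1, gap + 1))
        if quiet and y + gap <= last_year:
            return y
    return None

# e.g. cited in 1990 and 1995, then silent until a 2009 revival:
find_death([1990, 1995, 2009], gap=5, last_year=2015)  # -> 1995
```

Note that a later citation (2009 above) does not undo the death; under this reading it would instead register as a rebirth_k_m event.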

In [29]:
# visualizations of remarkable lives and deaths

viz = Path(_dh[0]).joinpath('analyses','remarkable lives and deaths.ipynb')

settings = [
    {"database_name":"sociology-wos","dtype":'c',"birth_key":'first'},
    {"database_name":"sociology-wos","dtype":'ta',"birth_key":'first'},
    {"database_name":"sociology-wos","dtype":'fa',"birth_key":'first'}
]

for sett in settings:
    pm.execute_notebook(
        str(viz),
        str(viz),
        parameters = sett,
        nest_asyncio=True
    )
INFO:papermill:Input Notebook:  G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\remarkable lives and deaths.ipynb
INFO:papermill:Output Notebook: G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\remarkable lives and deaths.ipynb
INFO:papermill:Executing notebook with kernel: python3
INFO:papermill:Input Notebook:  G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\remarkable lives and deaths.ipynb
INFO:papermill:Output Notebook: G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\remarkable lives and deaths.ipynb

INFO:papermill:Executing notebook with kernel: python3
INFO:papermill:Input Notebook:  G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\remarkable lives and deaths.ipynb
INFO:papermill:Output Notebook: G:\My Drive\projects\qualitative analysis of literature\post 5-12-2020\git repository _ citation-deaths\knowknow\analyses\remarkable lives and deaths.ipynb

INFO:papermill:Executing notebook with kernel: python3