Category Archives: statistics

A Look At The PC(USA) Church Dismissals In Alaska


A little under a year ago I did an analysis of some church dismissals from Tropical Florida and Mississippi Presbyteries of the Presbyterian Church (U.S.A.). In each presbytery multiple churches were dismissed permitting a statistical comparison of the sizes of those churches with the churches across the presbytery and the analysis found that the churches requesting dismissal were typically larger than the churches in the presbytery as a whole.

Now a similar situation has presented itself in the Presbytery of Alaska that allows me to once again go into statistical analysis mode.
 
The Presbytery web site contains this short news statement:

The Presbytery of Alaska met in Haines on April 5-7, 2013, and having concluded the processes set out in
“A GRACIOUS, PASTORAL RESONSE [sic] TO CHURCHES OF THE PRESBYTERY OF ALASKA REQUESTING DISAFFILIATION”
dismissed to the Covenant Order of Evangelical Presbyterian [sic] these churches: Kake, Angoon, Hoonah, Chapel by the Lake, Haines, and Skagway.

The Presbytery web site has been updated to list just the remaining nine churches.

At that one meeting the presbytery lost six of its 15 congregations, two-fifths of the total. The question is whether this presbytery follows the previous pattern of church size distributions.

Here are the 15 churches’ membership numbers from their 2011 statistical reports.

Church                        Location      2011 Membership

Remaining churches
  First PC                    Petersburg         39
  First PC                    Sitka              73
  First PC                    Wrangell           44
  First of Craig and Klawock  Craig              46
  Hydaburg PC                 Hydaburg           28
  Ketchikan PC                Ketchikan          42
  Metlakatla PC               Metlakatla         40
  Northern Lights UPC         Juneau             99
  Yakutat PC                  Yakutat            10

Dismissed churches
  Chapel by the Lake          Juneau            491
  First PC                    Skagway            30
  Frances Johnson Memorial PC Angoon             21
  Haines PC                   Haines             63
  Hoonah PC                   Hoonah             13
  Kake Memorial PC            Kake               14
Before the dismissals the Presbytery’s 15 congregations had 1,053 members combined. Of those, 421 members (40.0%) remain in the nine continuing churches and 632 (60.0%) left with the six churches that were dismissed. The median size of the churches in the Presbytery before dismissal was 40 and after it is 42. The median size of the dismissed churches is 25.5.
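As a quick check, these summary numbers can be reproduced in a few lines of Python, using the membership figures from the table above:

```python
from statistics import median

# 2011 membership figures from the table above
remaining = [39, 73, 44, 46, 28, 42, 40, 99, 10]
dismissed = [491, 30, 21, 63, 13, 14]
before = remaining + dismissed

print(sum(before))        # 1053 members before the dismissals
print(sum(remaining))     # 421 remain (40.0%)
print(sum(dismissed))     # 632 dismissed (60.0%)
print(median(before))     # 40   - median church size before dismissal
print(median(remaining))  # 42   - median after
print(median(dismissed))  # 25.5 - median of the dismissed churches
```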

So, the answer is that taken as a group the churches that requested dismissal to the Covenant Order of Evangelical Presbyterians are generally smaller than the churches remaining in the Presbytery. In other words, the pattern we saw in Mississippi and Tropical Florida is not seen here in Alaska; rather we find the reverse.

There is one pattern here that we have seen elsewhere – the departure of the largest church. While this did not happen in Tropical Florida – there the largest church requesting dismissal was the second largest church in the Presbytery – we did see in Mississippi that the three largest churches departed. We are seeing the largest church request dismissal in other presbyteries as well, but my more comprehensive analysis of that is still in the works. In Alaska, the largest church in the Presbytery was dismissed, and its membership is almost five times that of the second largest church. In fact the membership of Chapel by the Lake represented 46.6% of the Presbytery’s church membership before dismissals and 77.7% of the membership that was dismissed. (The fact that this one data point has such a large value is the reason I have so far not mentioned the statistical mean of the data.)

Looking a bit further at the data we see that the second and third smallest churches were also dismissed, contributing to the median size of the dismissed churches being below those that remain.

Just out of curiosity, if we drop the large outlier from the data set we find that there are 562 members in all the other churches with 40.1 members as the mean size of a church and 39.5 the median. For the five smaller churches that were dismissed there are 141 members (25.1%) and the remaining churches have 421 members (74.9%). These five departing churches have a mean size of 28.2 and a median of 21. The remaining churches have an average size of 46.8 and a median of 42.

All this to say that in this case, while the largest church in the Presbytery of Alaska was among those being dismissed, overall the churches that requested dismissal to ECO were generally smaller churches in the Presbytery.

I have not done the necessary research on these churches to formulate a good theory as to why this reverse pattern is present in this presbytery. Part of the reason this area may have significantly different dynamics is the isolation of each of these communities, which leaves church choices very limited. This is in contrast to areas with larger populations and better transportation networks where prospective members can church shop for a congregation that meets their long list of interests and preferences. Only in Juneau were there two Presbyterian churches in the same city. For the others, even if two churches were on the same island, travel between them was by sea or air, with no driving between the communities. There is generally no choosing between two Presbyterian churches with different styles or theological perspectives.

For the polity geeks I will mention that with the Presbytery of Alaska dropping to nine congregations, it is now below the minimum of ten required for a presbytery. The Layman reports that while the Presbytery continues to be administered as it has been, the Synod of Alaska-Northwest has assumed jurisdiction.

So, an interesting data set but one that may not be representative of other parts of the country. As other data sets get larger we will see what they look like.

Running The Numbers — Dismissals From Tropical Florida And Mississippi Presbyteries


Over the last couple of weeks a big deal has been made about how the recent dismissals of churches in the Southeastern US have removed about one-third of the members from a couple of presbyteries. Examples of this coverage include the articles on the Layman site (Florida, Mississippi) and an article on the Christianity Today site. Well, I decided to drill down into the data a bit.

First my data set: The latest available from the PC(USA) are the 2010 comparative statistics and the congregational reports, also for 2010. I looked up the stats on each congregation in each presbytery and used Table 4 from the Comparative Statistics as a comparison. The list from the PC(USA) Find A Congregation when searched by presbytery was compared to the list each presbytery has posted of their churches (Florida, Mississippi). At the time I ran the numbers the dismissed churches still appeared on each list. Interestingly, the PC(USA) list by presbytery misses churches in each presbytery (First Pompano Beach in Tropical Florida and First Pascagoula and Vernal Presbyterian in Mississippi). In addition, the PC(USA) list includes Wiggins Presbyterian in Mississippi which no longer appears on the presbytery’s list. Finally, one church in Tropical Florida, Korean Central Presbyterian Church, has no data in the PC(USA) statistics. Working through all these differences does result in a list of churches that agrees in number with Table 4.

I checked with Jason Reagan, the Layman reporter who wrote the articles, and he confirmed that the numbers in his articles are current numbers supplied by the churches and the presbyteries. For the analysis I did the 2010 numbers provide a consistent database with a specific snapshot date for comparison both within the presbyteries as well as between them.

Presbytery of Tropical Florida
With 55 of 56 churches reporting data for the close of 2010 the membership of the churches in the Presbytery was 13,291 based on adding all the individual churches and 13,425 from Table 4. The average size of a church was 242 members with a median of 127 members. For the 47 continuing churches the total membership is 10,137 with an average of 221 members per church and a median of 113 members. The nine dismissed churches have a total membership of 3,124, an average membership of 350, and a median membership of 188. Seven of the nine have memberships above the Presbytery median. As a percentage, 16.1% of the churches in the presbytery and 23.7% of the members in the presbytery were dismissed.

For comparison, the Layman reported that the total current membership of the Presbytery was 13,525 and the total current membership of the dismissed churches is about 3,800. The total membership number, slightly higher than the PC(USA) number, may reflect slight growth in the Presbytery or include the missing numbers for the one church. The difference in the number of members dismissed is significantly larger, and using the current numbers from the Layman results in 28.1% of the members being dismissed.

Presbytery of Mississippi
For the 43 churches in the Presbytery at the close of 2010 there is a total membership of 4,425 from adding the individual congregations compared with 4,485 from Table 4. The average membership is 103 members and the median is 47 members. The five dismissed churches have a membership of 1,297 (29.3% of the total membership) with an average membership of 259 members and a median of 361. Three dismissed churches have memberships higher than the median of the whole group and are the three largest churches in the Presbytery. One church is at the median of the whole group and one is below. The 38 churches remaining have a total membership of 3,128, an average membership of 82 and a median of 43.

According to the Layman article the current total membership of the Presbytery is about 4,300 members from which the dismissed churches will remove 1,400 members or about 32.5%.

Since collecting the data and running the numbers above, another presbytery in the area, Central Florida, has dismissed two churches. I am not going to do the same comprehensive analysis for that presbytery right now (they list 75 churches so it will take more time than I have at the moment), but Table 4 lists a total membership of 27,193 giving an average per congregation of 363 members. The table lists the median church membership at 206. For the two dismissed churches, Trinity of Satellite Beach has 877 members and First Presbyterian of Orlando has 3,521 members. These 4,398 members account for 16.2% of the presbytery membership.

Discussion
One reason for undertaking this analysis is because these are large enough samples to try to quantify something that some of us have noticed – that the churches leaving the PC(USA) are on average larger than most of the other churches in the denomination.  With the past pattern of one church from a presbytery here and one from another presbytery there arguments could be made that this was not typical or comparisons were weak.  Now, however, with five churches from one presbytery and nine from another being dismissed in groups there is a more coherent data set.

As I note above, the churches dismissed in this round are larger than the average church in the presbytery based on both the average size and the median. For Tropical Florida the dismissed churches are on average 45% larger (350 versus 242) and for Mississippi 150% larger (259 versus 103). Similarly, the medians are 48% and 668% larger for the dismissed churches (188 versus 127 and 361 versus 47).

It is worth noting that the average size congregation in the PC(USA) nationally in the 2010 data set is 191, and so while Tropical Florida has a larger average (242) and Mississippi a lower average (103), the averages of the churches dismissed from each presbytery are larger than the national average (350 and 259). Similarly, the national median is 95 and all these relationships hold for that measure as well.

What first caught my attention regarding these numbers was the claim that one-third of each presbytery had been dismissed. I have noted previously that one-third/two-thirds splits seem to be a common pattern when Presbyterians divide. In this case the fraction is a bit lower than one-third, but still in the neighborhood, so the pattern may hold here as well.

The problems with identifying this at the present time are, however, numerous. One issue is that additional churches may request dismissal, so this is only a snapshot and not a completed process. Another is that while the churches have been dismissed there are likely some members who will be in a continuing church or who will remain in the PC(USA) by joining neighboring churches. Another complication is that the dismissed churches are not all leaving together; some are going to ECO and some to the EPC. Finally, there is the question of whether it is reasonable to look at individual presbyteries in isolation and ignore the big picture of the whole denomination.

What we can document from this is the fact that on average the churches that are requesting, and being granted, dismissal are larger churches. I can come up with numerous reasons for this but further work would be necessary to document whether there is one dominant reason. One possible explanation is that conservative churches tend to be more vibrant and viable and are therefore in a better position to attract and retain members. Another possible explanation is that larger churches, simply by virtue of their size, are in a better position to strike out on their own, or join a fledgling group like ECO, while smaller churches are dependent on some of the resources of the larger denominational structure, including monies paid into the pension plan. Those are just two of the several possible explanations.

It is worth noting that this trend does present challenges for the PC(USA). As we see in Tropical Florida it amplifies the membership losses: the 16.1% of churches leaving took nearly one quarter of the members with them. And in each presbytery you will note that both the average and the median church size dropped after the dismissals.

While churches have been leaving for the EPC for a number of years, the dismissals to ECO have only just begun. It may be too early to draw reliable conclusions from these numbers, so we shall see if this trend continues or changes with time.

Presbyterian Church (U.S.A.) Releases The Latest Membership Statistics

Well, yesterday was July 1 – so a happy belated Canada Day to our friends north of the border.

It is also about the time of year that the Presbyterian Church (U.S.A.) releases their annual membership statistics and right on schedule the Stated Clerk released them yesterday.  While the full comparative statistics will take a little bit longer, now we have the Summary Statistics, Miscellaneous Information, and the Press Release. In addition, you can find commentary on the numbers from The Layman, and I would expect the Presbyterian Outlook to have an article shortly and probably a few more entities will weigh in as well. 

Running through the numbers I don’t see much change in direction of any of the categories.  Here are a few of the numbers and their change from 2009 to 2010.

Category                                    2010 Value   % change from 2009
Membership                                   2,016,091        -2.94%
Churches                                        10,560        -0.91%
Teaching Elders                                 21,161        -0.35%
Candidates                                       1,189        +0.59%
Ruling Elders                                   86,777        -3.62%
Gain by profession of faith, 17 and under       18,895        -7.83%
Gain by profession of faith, 18 and over        40,106        -4.71%
Gain by certificate                             21,615       -13.34%

Yes, there are plenty more statistics, but these are the ones related to membership that have a consistent trend, usually down, over the last three years. And yes, the PC(USA) is still above 2 million members so those who had numbers in the pool below 2 million are out of luck; at a loss of 61 thousand a year, we will see that next year.

The losses actually had some interesting variation this year.  For example, losses by certificate (transfer) have bounced around a bit but in this year the numbers bounced up 2,058 to 29,835.  That is still less than the 2008 losses by certificate of 34,340. Interestingly, the other losses, that is the people who left without transfer, hit a low for the last eleven years of 88,731, down from 100,253 last year.

So what does this mean in terms of breaking out the causes of decline? The losses from transfer of members to the Church Triumphant (those that died) were 32,471 or -1.56%. The internal replenishment in the form of youth joining the church was 18,895 or +0.91%. So our internal loss was 13,576 or -0.65%. By transfer the church gained 21,615 and lost 29,835 for a net of -8,220 or -0.40%. Adult profession of faith and other gains brought in 49,480 members while other losses were 88,731 for a net of -39,251 or -1.89%.

Therefore, we can say that of the 2.94% decline, 0.65% is the deficit in internal replacement, 0.40% is the imbalance in transfers, and almost two-thirds is in the imbalance of those coming and leaving without formal transfer.
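This decomposition is straightforward arithmetic on the reported gains and losses; here is a sketch, where the prior-year membership is inferred from the 2010 figure plus the year's net component losses:

```python
# Gains and losses from the 2010 PC(USA) summary statistics
prior_membership = 2_016_091 + 61_047  # 2010 members plus net loss (inferred)

internal  = 18_895 - 32_471  # youth professions minus deaths     = -13,576
transfers = 21_615 - 29_835  # gains minus losses by certificate  =  -8,220
other     = 49_480 - 88_731  # adult professions and other gains
                             # minus other losses                 = -39,251

for label, net in [("internal", internal), ("transfers", transfers),
                   ("other", other)]:
    print(f"{label:9s} {net:8,d}  {net / prior_membership:+.2%}")
```

Run against the reported numbers, this reproduces the -0.65%, -0.40%, and -1.89% components of the 2.94% decline.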

Regarding the ordained officers of the church there is a bit less clarity.  This first release always gives the total number of teaching elders (ministers) but we will have to wait a bit longer for the release of the bigger report to know how many are active ministers and how many are honorably retired. Last year, of 21,235 ministers 13,400 were listed as active.

The number of ruling elders listed I usually figure is the number currently serving on session.  With 10,560 churches and 86,777 elders that comes to an average of 8.22 per church.  (In case you are interested that is down from 9.26/church in 2001.)  The interesting thing of course is that while this is labeled “elders” we know it is not all the elders because the last Presbyterian Panel report says 21% of the members of the church have been ordained as ruling elders — so there should be closer to 423,379.  (An interesting juxtaposition with a workshop at Big Tent yesterday where the message was that “Being an elder is a ‘perpetual calling.'”)

Finally, I am never sure what to do with the candidates line because the full statistics always have a different number, a difference I have attributed to taking the “snapshot” at different times during the year.  For example, the new summary lists 1182 candidates in 2009 while the full comparative statistics list 1154. Another reason for the difference could be the data coming from different sources.

Anyway, for what follows I will just use the numbers as they appear in this preliminary release and the equivalent ones from earlier years.

I wanted to look at how all these categories are changing with time and relative to one another.  So taking the data back to 2001 I normalized each category to that year.  That is to say I took all the data in a category and divided it by the 2001 value so they all start at a value of 1 for that year and proportional changes can be seen more clearly.  Here is what I get:


Now we can see that the fastest declining category is the total membership of the church, closely followed by the number of ruling elders. One interpretation is that ruling elders are departing the church at almost the same rate as other members, but that would not be correct. Remember that this number is actually a measure of those serving on sessions, so it means that sessions are decreasing in size proportionate with the decrease in membership, not with the decrease in the number of congregations. I’m open to suggestions about why this might be – smaller sessions for smaller churches? smaller sessions to be more efficient? smaller sessions because the pool of ruling elders is decreasing? An interesting topic for future thought.
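The normalization behind the plot is just a per-category division by the 2001 value; a minimal sketch (the three-element series below is hypothetical, not the actual data):

```python
def normalize_to_first(values):
    """Divide every value in a series by its first (2001) value so
    each category starts at 1.0 and proportional change is visible."""
    base = values[0]
    return [v / base for v in values]

# hypothetical series for the number of churches at three snapshots
print(normalize_to_first([11_141, 10_858, 10_560]))
```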

For the other numbers, the number of churches has decreased slightly (5% over 10 years), the number of teaching elders has held very steady over that time, and the number of candidates has shown significant growth.  Clearly we have a window of opportunity with this abundance of candidates to revitalize congregations and develop those 1001 new worshiping communities.

At this point I think I’ll wrap this up leaving the finances completely untouched. Echoing the sentiments of the Stated Clerk, I have found Presbyterians to be a very generous bunch, especially when the mission is compelling. So the question is, with the denomination positioned in this present situation, what compelling mission is out there for the financial and human resources at our disposal? There is apparently a lot of talent in the pipeline — I hope they are ready for some creative and out-of-the-box ministry.

The PC(USA) Does Appear To Have A “Lightning Rod”

I have two polity-heavy posts that I have been working on and decided to take a break from those to exercise the other side of my brain and crunch some numbers…

In the initial letter introducing the Fellowship PC(USA) the statement is made

“Homosexual ordination has been the flashpoint of controversy for the last 35 years.”

On most levels I take issue with this because in a larger sense Presbyterians around the world have throughout their history been debating scriptural and confessional imperatives and implications and this is only the latest specific detail over which the discussion is continuing.

But on a more practical level this statement seems to hold a fair amount of validity to me based on my personal experience.  For the last several votes on changing Book of Order section G-6.0106b it has always struck me that my own presbytery had significantly higher attendance for the amendment vote meeting than for regular meetings.  Even at the beginning of the debate, for our vote to include the current “fidelity and chastity” language in the constitution we had 284 commissioners vote.  A couple of meetings later a very contentious issue had 202 commissioners vote.  The pattern still continues today as I have had more than one commissioner ask me when our presbytery is voting and when I mention the different meetings for the different amendments they tell me they only want to know about Amendment 10-A.

Well, with the voting this year I have an ideal data set to test whether this observation holds in other presbyteries as well.  Short answer – YES!

First, the usual comments on the data I use: My data is aggregated from numbers from Twitter as well as vote counts at the Covenant Network, Yes on 10-A, Reclaim Biblical Teaching and the Layman. This aggregation is available in my spreadsheet through this past weekend’s reports. Because I will be looking at voting on all three major issues — Belhar, nFOG and 10-A — the Layman and Reclaim Biblical Teaching charts provide the full data set. (Note how this in itself is suggestive of my hypothesis about the focus on the 10-A voting as that is the only one followed by all four of these sources.)

Now there are 55 recorded votes for the Belhar Confession, 62 for the nFOG, and 115 for 10-A.  (Again, suggestive of the higher-profile nature of 10-A and the need for a recorded vote.)  Of these we have 39 recorded pairings of Belhar and nFOG, 36 pairings of Belhar and 10-A, and 45 pairings of nFOG and 10-A.

For those 39 presbyteries with recorded votes on Belhar and nFOG the ratios between the two range from having 31% more votes for Belhar to having 40% less.  But the average and median are right at 1.00 indicating that on balance the turnout is the same for those two issues with a fairly symmetric distribution around that.

For the 36 presbyteries that have recorded votes on both 10-A and Belhar there are, on average, 12% more commissioners voting on 10-A than Belhar with the range from 75% higher to 13% lower.  The comparison of nFOG to 10-A for those 45 presbyteries is very similar with the average 13% higher for 10-A and the range from 63% higher to 12% lower.  With medians at 7% and 5% respectively, the distributions are clearly not as symmetric, having extended tails at the higher end.

I am sure that several of you have already started pointing out the problem with the analysis I just did: the three votes are not always three independent events. In many cases multiple votes are taken at the same meeting, so, with the exception of a few commissioners who only come for the one vote they are interested in, the total numbers of votes cast should be, and in several cases are, nearly identical. (The other thing that could cause minor fluctuations is the fact that I don’t include abstentions.)

So, my first point is that in spite of not accounting for independent events the numbers are so robust that the upward shift is visible in this mixed data set.

Well, as much as I would like to separate these out into independent data sets, I have not personally kept a time history of the voting to be absolutely certain of which votes were taken at the same meeting and which were not. (If any of you have that information please do the analysis of independent events and let me know how far off I am.) I can tell you several votes were taken at the same meeting, and in fact these are very obvious in the posted spreadsheet, having only a vote or two variation in the numbers. But let me try to separate out the different votes using my usual criteria that a 4 vote difference or a 4% difference is normal fluctuation, and vote totals within this range will be treated as having happened at the same meeting. Also, from here on I will only consider the comparison of the Belhar and 10-A votes for two reasons: 1) my earlier work showing the closer correlation of these two votes still holds, and 2) it is my impression, and only my impression, that presbyteries tend to take these votes at different meetings more than they split nFOG and 10-A. After the voting is over I’ll revisit this topic with the final data set and I suspect that we will find a bimodal distribution to help us answer this question.
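The same-meeting criterion described above (treat two vote totals as one meeting if they differ by no more than 4 votes or 4%) can be sketched as:

```python
def same_meeting(total_a, total_b, abs_tol=4, rel_tol=0.04):
    """Criterion from the text: treat two vote totals as coming from
    the same meeting if they differ by no more than 4 votes or 4%."""
    diff = abs(total_a - total_b)
    return diff <= abs_tol or diff <= rel_tol * max(total_a, total_b)

print(same_meeting(150, 152))  # True:  a 2-vote difference is normal fluctuation
print(same_meeting(150, 180))  # False: a 30-vote jump suggests separate meetings
```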

So, of the 36 presbyteries with recorded votes on both Belhar and 10-A, 20 have noticeable differences in the number of votes. Eighteen of those are higher for 10-A and two are higher for Belhar. Of the ones higher for 10-A, the increases range from 7% to 75% with an average of 24% and a median of 18%. While tempting to do the full frequency distribution analysis at this point, I will save that until there are more data.

Now, accepting the fact that one of my analyses certainly includes dependent events and the other probably has unfairly eliminated independent events, it is still clear that a vote on “fidelity and chastity” brings out the commissioners more than a vote on changing the Book of Confessions.  Like it or not, we have to accept the premise from the Fellowship PC(USA) letter that there is a “flashpoint” or “lightning rod” in the denomination.

Before bringing this exercise to a close, let’s ask the obvious question – “Was the increase in commissioners among those who voted yes or those who voted no?” The answer is both, but while there is significant variability between presbyteries, it was the no voters who tended to show up for the vote on 10-A. And yes, this is based on the presumption that a commissioner who voted one way on Belhar was going to vote the same way on 10-A, so the other way to look at this is that there was a trend toward more uniform commissioner turnout, with some commissioners who voted, or would have voted, yes on Belhar voting no on 10-A.

In terms of the specific numbers, the average number of yes votes increases 7% while the number of no votes more than doubles, rising 102%. However, these averages are influenced by a couple of presbyteries where a small number of votes in a given column becomes a large ratio when they pick up just a few more votes. For example, North Alabama had 3 no votes on Belhar and 28 no votes on 10-A, a nine-fold increase. Another case is Central Washington, which went from 7 yes on Belhar to 12 yes on 10-A for a 71% increase. With such extreme values present, the median of each data set (the value for which half are above and half are below) is the more reasonable measure. Still, the median increase in yes votes is 4% and the median increase in no votes is 28%.
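The pull of a few extreme ratios on the mean, and the median's resistance to it, is easy to demonstrate (the ratios below are hypothetical apart from the roughly nine-fold North Alabama case):

```python
from statistics import mean, median

# hypothetical per-presbytery increases in no votes, as ratios,
# with one North-Alabama-style extreme value at the end
ratios = [1.1, 1.2, 1.3, 1.4, 9.3]
print(round(mean(ratios), 2))  # 2.86 - dragged upward by the single outlier
print(median(ratios))          # 1.3  - barely affected
```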

So when presbyteries have important issues to discuss it appears from this data that commissioners are more likely to show up when the issue is G-6.0106b. I have to agree that for the last few decades the “issue du jour” for the mainline Presbyterians has been sexual orientation and practice, particularly as it applies to those who hold ordained office. But throughout the history of Presbyterianism other issues, such as church-state relations and confessional subscription and standards, have been the flashpoint over which we have debated, and divided. (It would be interesting to know if presbytery meeting attendance increased for votes on modifications to the Westminster Standards earlier in our history.) It also leads to the interesting question of what will become the “issue du jour” if 10-A passes. I think many would see the denomination moving on, and rather than staying with modifications to G-6.0106b the next discussion point will probably be the definition of marriage (W-4.9001). But maybe it is something else that does not come to my mind at the moment. And the question of whether we Presbyterians need an issue as the focus of our debate is a topic for another time. We will see what develops over the next few years.

A Brief Update On Cross-Issue Correlation In PC(USA) Amendment Voting

Not Much Change.

Brief enough for you?  OK, you want a bit more?

As I have commented the past couple of posts on this stuff there was a bit of a question about the data, particularly for the New Form of Government vote, because the official tally from the Office of the General Assembly differed markedly from the unofficial “word on the street” numbers.  Well, in the last week the unofficial lists have caught up and as of today’s release of the official numbers the differences have mostly disappeared.  The official numbers always lag a bit because of the extra time required to report the votes (the Stated Clerk still does not accept reports from those of us who tweet it).  So, at the moment the numbers are: Belhar – Official 44-23, unofficial 46-29; nFOG – Official 50-33, unofficial 50-39; Amendment 10-A – Official 57-37, unofficial 67-47.

For those playing along at home it appears that nFOG and 10-A are on track to be affirmed by the presbyteries but the Belhar Confession is very close and trailing the 2/3 confirmation it needs to join the Book of Confessions.  Of the other 14 amendments being voted upon, 11 have now been affirmed by the presbyteries and the other three are seeing some degree of negative votes, but still on track to be affirmed.

As usual, my data is aggregated from numbers from Twitter as well as vote counts at the Covenant Network, Yes on 10-A, Reclaim Biblical Teaching and the Layman. This aggregation is available in my spreadsheet and my cross-vote spreadsheet through yesterday’s reports.

What appears to have happened is that the Layman and Reclaim Biblical Teaching lists, which are the ones that track the Belhar and the nFOG voting, have gotten a bunch of new data on nFOG.  The good news is that new data is always good and makes my analyses more reliable.  The bad news is that most of these are unrecorded votes meaning that either the vote was taken by voice or a show of hands and not counted out, or the counted data was not available to the reporting groups.  The bottom line is that there is not much new for my strength of voting analysis so I’ll let my previous one stand for the moment and just look at the cross-tabulation of the “yes/no” votes.

So here are the correlations:

n=52          Belhar Yes   Belhar No
nFOG yes      25 (48%)      3 (6%)
nFOG no        4 (8%)      20 (38%)

n=48          Belhar Yes   Belhar No
10-A yes      27 (56%)      3 (6%)
10-A no        4 (8%)      14 (29%)

n=65          nFOG Yes     nFOG No
10-A yes      28 (43%)      6 (9%)
10-A no       10 (15%)     21 (32%)

Well, there is a little shift, but the same basic pattern holds and the changes are mostly minor. The Belhar/nFOG correlation is now tied for the best with 86% of presbyteries voting the same way on both issues (down slightly from 90% last time) and 14% voting opposite. Yes, Belhar Yes/nFOG No now has one more vote than the other combination where it had one less last time — still pretty similar.
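The same/opposite percentages come straight from the 2x2 counts; for the Belhar/nFOG table, for example:

```python
# Belhar/nFOG cross-tabulation counts from the table above
yes_yes, yes_no = 25, 3   # nFOG yes row: Belhar yes, Belhar no
no_yes, no_no = 4, 20     # nFOG no row
n = yes_yes + yes_no + no_yes + no_no  # 52 presbyteries

same = yes_yes + no_no       # voted the same way on both issues
opposite = yes_no + no_yes   # split their votes between the issues
print(f"{same / n:.1%} same, {opposite / n:.1%} opposite")
```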

The Belhar/10-A correlation is interesting because it has the same number of opposite voting presbyteries as the Belhar/nFOG correlation and within the rounding this gives essentially the same percentages – 14% opposite and 87% same.  The previous analysis had 83% of the presbyteries voting the same so there is a slight increase in the correlation.

Finally, we have the least favorable correlation, the nFOG/10-A voting, and the numbers are very close to the previous analysis, and maybe a bit better correlated.  Previously, within the rounding, 75% of the presbyteries voted the same way on both issues and now we have the same number again.  Last time the two opposite-voting categories were equal, and now we see a slight tendency for a presbytery that votes no on 10-A to still support a yes vote on nFOG.  In fact, since the previous analysis only one presbytery has been added to the count that voted yes on 10-A and no on nFOG.

What does it all mean?  Well, for the data crunchers like me it is nice to see that the larger quantity of data supports the preliminary analysis I did before.  We are still only at about 1/3 of the presbyteries in any one of these comparisons so this is still in a preliminary mode, but it is valuable to see that as the data set grows the basic trends remain the same.  It is also suggestive that we can have some confidence in the previous analysis that used the strength of voting.

It also continues to encourage us to ask the question of why these votes are correlated.  I’ve pondered that in the previous posts so won’t repeat it again in print until the data set has filled out substantially more.  Some of you have suggested additional variables to look at with the strength of vote numbers to help clarify that question a bit.  When the strength of vote data has increased some more I’ll revisit it again.

This continued trend does not, however, allow us to say anything about the time trend of the data.  These data have no associated date information, and just because they were added in the last week does not mean that the votes were taken in that time period.  So while they show the same trend, we cannot say that the voting trend truly “continues” in a temporal sense.

Anyway, a little lunch hour diversion and we will watch the voting continue and await more data.  The rest of this week I have a heavy meeting schedule so I’ll try to catch up on some global Presbyterian issues over the weekend.  Stay tuned…

Cross-Issue Correlation In PC(USA) Amendment Voting

OK, I need to get two things onto the table right at the beginning of this post:


  1. Yes, this is an extremely geekish and polity wonkish post, but that’s what interests me and this analysis is the one I have really wanted to do since the 219th General Assembly adjourned last July.  I do think there is something important about the PC(USA) in here so if you want to skip the data analysis and jump to the end you will find my discussion there.
  2. I posted a preliminary result on Twitter on Saturday but got the variables confused.  Sorry about that. I posted a correction on Twitter and will point out the error when I come to it in this post.
So, the question that has had me on the edge of my seat is the degree to which each of the three high-profile amendments is correlated with the other two.  I took an initial pass at this question a couple of weeks back and found a strong correlation.  That correlation has weakened a bit but is still present, stronger in some relationships than others.  While it still may be a bit premature to make strong conclusions from the data at this point in time, I think I’ve got enough data to do a preliminary analysis.

Now, if you are looking for just the vote results after last Saturday here is the “word on the street.” Belhar is still not getting the 2/3 it needs with 32 yes and 28 no.  The New Form of Government continues to have weak support and still trails, currently at 25 to 31.  The story of the last week is that support for Amendment 10-A continues at the pace we have seen throughout the month and with three more presbyteries switching their votes a total of 12 presbyteries have shifted to “yes” with only one shifting to “no.”  At this point enough presbyteries have shifted (a net of nine was needed) that with all the rest of the presbyteries voting as they did in the last round Amendment 10-A will be approved. At the end of the weekend the vote stood at 55 to 41.  No further analysis of that today, I’ll come back to that in another week or two. (Particularly in light of the question about the vote totals that is raised at the end of the next paragraph.)

First, the usual details regarding data:  For my data I have aggregated numbers from Twitter as well as vote counts at the Covenant Network, Yes on 10-A, Reclaim Biblical Teaching and the Layman. This aggregation is available in my spreadsheet.  I have also updated my cross-vote spreadsheet through Saturday’s reports.  The analysis below is more sensitive to the exact vote count and where the tally sheets sometimes differ a bit I have used either a majority among them, the Twitter reports, or a consistency in total votes to select a preferred number.  This is also probably a good place to add that the voting is not finished yet and this analysis is only preliminary based on the current data. And in a very interesting development today as I was finishing this up, the official vote tally from the Office of the General Assembly was posted.  It has caught the attention of several of us because it has numbers significantly different than the unofficial sites — nFOG 38 to 25, Belhar 38 to 18, and 10-A 47 to 33.  The difference is presumably due to reports by presbytery stated clerks not reflected in the unofficial counts.  Hopefully with time the two sets of lists will converge.

So, let’s take the three comparisons from strongest to weakest (and if you want to see the graphs in more detail they are larger in their original form and you can open them individually):

Belhar to nFOG
The strongest relationship between the issues is between the votes on the Belhar Confession and the New Form of Government. (This is the one I should have pointed out in the tweet on Saturday.)  So far 33 presbyteries have voted on both of these issues, and 27 of those have recorded vote numbers on both votes.  Looking at the numbers you can see the strength in both the cross-tabulation and the linear regression:


 n=33        Belhar yes   Belhar no
 nFOG yes    10 (30%)      2 (7%)
 nFOG no      1 (3%)      20 (60%)

Bottom line: The strength of a presbytery’s vote on nFOG is going to be very close to the strength of a presbytery’s vote on Belhar.  The linear fit is good, with an R² = 0.73 (a number very much like the correlation coefficient I talked about in a previous post, with 1.0 as well correlated and 0.0 as uncorrelated, except that this number is always positive) and a slope pretty close to 1 (the two vote percentages increase in the same proportion).  This is seen in the yes/no comparison, where 30 presbyteries have voted the same way on both issues and only 3 (10%) have voted opposite on them.
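For anyone wanting to reproduce the R² and slope numbers, here is a minimal least-squares sketch in pure Python.  The x and y lists are made-up placeholder vote percentages, not the actual presbytery data (that lives in the spreadsheets linked above):

```python
# Minimal least-squares fit plus R^2, the statistics used in this post.
def linfit(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2
                 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    r2 = 1 - ss_res / ss_tot     # fraction of variance explained
    return slope, intercept, r2

x = [20, 35, 50, 65, 80]   # placeholder: percent yes on one amendment
y = [25, 30, 55, 60, 85]   # placeholder: percent yes on the other
slope, intercept, r2 = linfit(x, y)
print(f"slope={slope:.2f} intercept={intercept:.1f} R2={r2:.2f}")
```

A slope near 1 with a high R², as in this toy data, is the pattern the Belhar/nFOG comparison shows.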

Belhar to Amendment 10-A
The next strongest relationship between the issues is that between the votes on the Belhar Confession and Amendment 10-A.  (This is the one I incorrectly pointed to in the tweet.) So far 35 presbyteries have voted on both of these issues, and 25 of those have recorded vote numbers on both votes.  Here is what the numbers look like:


 n=35        Belhar yes   Belhar no
 10-A yes    17 (49%)      3 (9%)
 10-A no      3 (9%)      12 (34%)

Bottom line: The strength of a presbytery’s vote on Amendment 10-A is going to be related to the strength of a presbytery’s vote on Belhar, but not as strongly as in the last case and not in 1:1 proportion.  In this case the linear fit is not as good, but still moderate, with an R² = 0.62 and a slope of 0.51.  There is also a significant upward shift in the trend line of almost 20%.  What this means is that for presbyteries not strongly in favor of Belhar, on average there is a 20% “base” in favor of Amendment 10-A.  On the other end, a presbytery strongly in favor of Belhar has, on average, a 30% “base” opposed to Amendment 10-A.  The yes/no comparison also shows that the linkage is not as strong and direct: 29 presbyteries have voted the same way on both issues and six (18%) have voted opposite on them.  From these results, the association of these two issues is only partial and the attitudes on one are not driving the other as strongly as might be suspected.

nFOG to Amendment 10-A
The weakest relationship is between the votes on the nFOG and Amendment 10-A. So far 36 presbyteries have voted on both of these issues, and 23 of those have recorded vote numbers on both votes.  Here is what the numbers look like:


 n=36        nFOG yes     nFOG no
 10-A yes    12 (33%)      5 (14%)
 10-A no      5 (14%)     14 (42%)

Bottom line: There is a weak, positive relationship between a presbytery’s voting strength on nFOG and the vote strength on 10-A.  However, as can be seen in the scatter of the data on the graph, especially at the higher end, the relationship is weak.  The scatter is evident in the R² = 0.39, and the lower slope of 0.46 is also suggestive of a weaker linkage.  The yes/no comparison supports the conclusion that the association is not as strong and direct, with almost 1/3 of the presbyteries voting opposite ways on the two issues.

Discussion and Conclusions
I must admit that the strength of the Belhar/nFOG association was a bit of a surprise to me.  With the on-going discussion of the synergy between Belhar and 10-A, I was expecting that pair to have the strongest correlation.  And the very nearly 1:1 association means that the two issues probably elicit the same response from any given commissioner.  One thought that occurred to me is the similar nature of these two issues in regards to their impact on PC(USA) polity.  While the impact of each is still being debated and is, to a certain degree, unknown, if approved they each would leave a significant mark on the constitutional documents.  There could also be a less tangible factor in the willingness to preserve the status quo — since these two amendments have similar impacts on the established order of things, it is reasonable to presume that if a commissioner had a particular comfort level with changing one of them, they would have a similar comfort level changing the other.  But whether it is related to those explanations, or other factors, the data appear to show that even if presbytery commissioners don’t necessarily explicitly link them, they still seem to think about them in the same way.

Having said that, and recognizing the vote tally differences from today’s announcement, I need to point out that it appears twice as many presbyteries have voted against both of them as have approved both.  This raises a couple of questions when we look at the voting trends for the issues by themselves, since the votes overall are more even.  The first is that as the double-issue voting catches up the close agreement could go away.  But if the close agreement continues, and considering that one currently has a majority and the other does not, we might expect the vote margins to narrow.  We also open up the possibility that Belhar might not even receive a majority vote if nFOG continues to not receive a majority.  The opposite could also be true — that nFOG will be pulled up by future positive voting on Belhar.

We could also ask the question about the strength of Belhar from the 10-A relationship.  Doing a back-of-the-envelope calculation and extrapolating out the 10-A voting based on current proportions, a 99 to 75 final vote (56.6% yes) would be a reasonable conclusion.  If we then mix apples and oranges and ignore which is the dependent and which the independent variable, plugging a 56.6% yes vote on 10-A into the regression formula gives a 73% yes vote on Belhar.  Fun to speculate, but I just violated too many mathematical and data rules to really believe that.  A more valid approach would be to take the presbytery yes/no vote cross-tabulation as a guide, where we see that at the present time the opposite-voting categories would offset each other.  This would suggest that (apples to apples) the Belhar final vote would be very close to the 10-A final count, in which case 56.6% won’t get it approved.
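The back-of-the-envelope extrapolation itself is simple enough to show.  A sketch (the 55-41 tally and the 173-presbytery total are from the posts above; small differences from the 99-to-75 figure come down to rounding convention):

```python
# Proportional extrapolation of the current Amendment 10-A tally
# (55 yes, 41 no) out to all 173 presbyteries.
yes, no, total = 55, 41, 173
scale = total / (yes + no)
final_yes, final_no = round(yes * scale), round(no * scale)
pct_yes = 100 * yes / (yes + no)   # current yes proportion
print(final_yes, final_no, round(pct_yes, 1))
```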

I’m not sure there is much to say about the weak correlation between nFOG and 10-A.  This is more of what I was initially expecting since the two issues do not have a lot in common polity-wise.  The weak linkage seen could be some polity point I am overlooking or a desire to preserve the status quo.  Either way, there is not enough strength in that correlation to risk making any conclusions about one from the outcome of the other.

So that is what I see at this point.  I will point out again that this is truly preliminary since at this time for each pairing only around 1/5 of the presbyteries have voted on both amendments.  I look forward to seeing how this progresses as the voting continues filling in the missing data.  Stay tuned…

PC(USA) Amendment 10-A Voting About To Reach Half-Way Point

There has been a flurry of presbytery voting this past week with some interesting developments.  Here is a quick summary and some observations.

Following presbytery meetings last Saturday it appears from the reports that 81 out of the 173 presbyteries have voted on Amendment 10-A, quickly approaching the half-way mark of 87 presbyteries.  A potentially bigger development is the flurry of presbyteries that have voted “yes” on 10-A after voting “no” on 08-B in the last round.  The number of presbyteries switching now stands at a net change of eight towards “yes,” with nine total switching to “yes” and one switching to “no.”  Since a net change of nine is necessary for the passage of 10-A (it was 78 “yes” and 95 “no” last time) if the current trend continues it is reasonable to expect that 10-A will be approved.  However, don’t take that as a done deal because 1) part of being Presbyterian is the process and 2) just as there was a flurry of “yes” changes this weekend there could as easily be a momentum shift with a number of “no” switches in the future.  Oh, and if you are keeping count I think the vote is 46 “yes” and 35 “no.”

One of the interesting things in the past few weeks was how the three votes were tracking together — That has changed somewhat.  The first observation is that while there was a burst of voting on 10-A, there was not a corresponding burst on Belhar or nFOG.  At the present time 51 presbyteries have voted on Belhar and 47 have voted on nFOG.  Breaking it down, I have 12 presbyteries that have voted on all three amendments, 14 that have voted on Belhar and nFOG but not 10-A, 15 that have voted only on nFOG and 10-A, and 13 that have voted on 10-A and Belhar only.  That gives a total of 61 presbyteries (including my own) who have not voted on any of the amendments yet.

The second thing that struck me was a bit of a weakening of the cross-issue correlation I commented on a little while ago.  While I have not done a full recalculation of my chart to include Saturday’s voting, looking at the numbers it seems there have been a few presbyteries who have voted “yes” on 10-A and “no” on nFOG, to the point that while 10-A is currently passing, nFOG is trailing 21-26.  I don’t know if it is this trend, or just a coincidence, that a few days ago GA Moderator Cynthia Bolbach in her monthly column encouraged passage of the new Form of Government and pointed readers to the nFOG blog.  (And yes, Ms. Bolbach’s statement to avoid nFOG advocacy applied only to the sessions of the General Assembly and not the voting period.)  And if you are keeping score at home, both Belhar (needs 2/3 to pass) and nFOG are currently trailing, the former 28 to 23 and the latter 21 to 26.

I will leave further analysis of Belhar and nFOG for another time as well as the cross-issue trends.  But taking a more detailed look at 10-A voting we have 73 presbyteries with reported numbers for their votes on both 08-B and 10-A.  I have aggregated these numbers from Twitter as well as vote counts at the Covenant Network, Yes on 10-A, Reclaim Biblical Teaching and the Layman. This aggregation is available in my spreadsheet.

At the present time the total reported number of voting commissioners is 8635, down 8% from the corresponding 08-B total of 9337.  Votes for 10-A have increased slightly from last time, 4602 to 4726, a 3% increase.  Votes against have dropped 17% from 4735 to 3909.

In the chart below I try to graphically show the different results from the presbyteries.  I use my usual margin of a 4% change (or 4 votes for small numbers) as random variation, so numbers in that range are considered equivalent for this analysis.  And for the chart below, the comparisons mentioned (Y>N, Y<N, Y=N) refer to the magnitudes, or absolute values, of the changes in Yes and No votes.  For example, if Yes votes decreased by 15 votes and No votes increased by 6 votes, that would be counted under the “Y decrease, N increase, Y<N” box.  I hope that makes sense.
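That classification rule can be sketched in code.  This is my reading of the rule, not the author’s actual script; in particular, I have assumed the Y>N / Y<N comparison is a strict comparison of the absolute changes:

```python
# Sketch of the classification rule described above: a change within
# 4% (or 4 votes for small counts) is treated as "no change".
def direction(old, new):
    diff = new - old
    margin = max(4, 0.04 * old)   # 4 votes or 4%, whichever is larger
    if abs(diff) <= margin:
        return "no change"
    return "increase" if diff > 0 else "decrease"

def classify(yes_old, yes_new, no_old, no_new):
    dy = direction(yes_old, yes_new)
    dn = direction(no_old, no_new)
    label = f"Y {dy}, N {dn}"
    if "no change" not in (dy, dn):
        # compare magnitudes of the yes and no changes
        ay, an = abs(yes_new - yes_old), abs(no_new - no_old)
        label += ", Y > N" if ay > an else (", Y < N" if ay < an else ", Y = N")
    return label

# A Central Florida-style example: 17 more yes votes, 20 fewer no votes
print(classify(100, 117, 120, 100))
```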

  Y increase, N decrease, Y > N:  n=8   (11%)
  Y increase, N decrease, Y < N:  n=13  (18%)
  Y increase, N decrease, Y = N:  n=4   (5%)
  Y increase, N no change:        n=7   (10%)
  Y increase, N increase, Y > N:  n=1   (1%)
  Y increase, N increase, Y = N:  n=0   (0%)
  Y increase, N increase, Y < N:  n=0   (0%)
  Y no change, N decrease:        n=13  (18%)
  Y no change, N no change:       n=4   (5%)
  Y no change, N increase:        n=3   (4%)
  Y decrease, N decrease, Y < N:  n=4   (5%)
  Y decrease, N decrease, Y = N:  n=6   (8%)
  Y decrease, N decrease, Y > N:  n=4   (5%)
  Y decrease, N no change:        n=3   (4%)
  Y decrease, N increase, Y = N:  n=2   (3%)
  Y decrease, N increase, Y < N:  n=0   (0%)
  Y decrease, N increase, Y > N:  n=1   (1%)

  (n=73 presbyteries total)

See any patterns?  There is a tendency for “no” votes to decrease — in 10% of the presbyteries they increase, in 19% they are constant, and 71% of the time they decrease.  And there is a weaker tendency for “yes” votes to increase — in 45% of the presbyteries they increase, in 27.5% they remain the same, and in 27.5% they decrease.  But if you are looking for patterns of no decreases or yes increases, it is tough to make a strong argument for a consistent behavior across all the presbyteries.  The best we can say is that the two cases of decreases in “no” with stable “yes” and decreases in “no” with smaller increases in “yes” comprise about 1/3 of the presbytery vote changes.  The other 2/3 are more evenly distributed across a greater variety of cases.

OK, eyes glazed over?  The object of this extensive enumeration is to make the point that there is little in the way of strong trends that one can point at.  Is the trend for shifting from “no” votes to “yes” votes?  Yes, in several presbyteries like Central Florida where the total number was stable (a 3 vote/1% drop) but there were 17 more “yes” votes and 20 fewer “no” votes. And then there is Stockton where there were 50 votes each time but five votes shifted from “yes” to “no.”  Yes, we can say that there are fewer “no” votes overall, but sometimes that comes at no increase in “yes” votes, as in the case of Cimarron, and sometimes with a substantial decrease in “yes” votes as well, such as happened in Heartland.

Bottom line – there are a few trends, but if you are looking for easy explanations (like “the conservatives are leaving” or there is a “shift to equality”) it is hard to tease that out as a simple rule when you look on a case-by-case basis at presbytery voting.  Presbyteries are amazingly unique entities — that is what I have found in my years of tracking this stuff.  (And that does not even include consideration of weather conditions, wind direction, what show is on in prime time that evening, or who is having a conference in Phoenix.)  Believe me, I would love easy answers.  But I have lost count of the number of numerical models I have made that are either solvable but too simplistic or complex but underdetermined.

So we will see how the voting goes in the next few weeks.  We are getting enough data that I can start calculating robust statistics and frequency distributions like I have in the past.  And I will try to keep the cross-tabulation above updated as well as the cross-issue correlation chart.  So stay tuned…

Drilling Down In The Religious Life Survey — Is Church Attendance Really That Good An Indicator?

I don’t know how many other bloggers post something and then spend the next 24 hours second guessing themselves.  In this case, one of my conclusions yesterday was nagging at me and in a sense of academic honesty I just had to know if in my treatment of the data I had fooled myself and any readers along the way.  So, being the geek that I am I decided to drill down into that one particular survey question to see what else there was to see.

The conclusion that was nagging me was the sensitivity or “high bar” of church attendance as correlated to the growth or decline of denominations.  As part of the analysis I combined some categories in the survey and did not discuss the actual numbers from the survey.  So to remedy that here is an expanded analysis of that single question.  Those who are squeamish over statistics or don’t feel particularly geeky might want to turn away now — this analysis clarifies and qualifies some details but does little to change the overall conclusion I reached yesterday.

To recap, I am working with two data sets.  The first is the National Council of Churches list of the 25 largest denominations, especially the 14 of those that reported growth rates for 2010.  The second is The U.S. Religious Landscape Survey dataset from the Pew Research Center.  The resulting analysis and data manipulation is mine and it should be kept in mind that “The Pew Research Center and the Pew Forum on Religion & Public Life bear no responsibility for the analyses or interpretations of the data presented here.”  For consistency I will again use only the data for the 48 contiguous United States and will not implement their weighting scheme.

In this analysis I want to look at only two questions in the survey.  The first is the multi-part question that established a respondent’s religion or denomination.  This was user supplied and provided some interesting results, as you will see in a minute.  I want to compare that affiliation information against the question “Aside from weddings and funerals, how often do you attend religious services… more than once a week, once a week, once or twice a month, a few times a year, seldom, or never?”

So, for the 14 denominations on the top 25 list that provided information, here are the results for that question.  I have ranked them by growth rate and include total respondents with each answer as well as the percentage.

(Cells give the number of respondents with the percentage in parentheses.)

Denomination                              2010 Growth  Attend more    Attend once   Attend once/   Attend a few   Attend       Attend      No
                                          Rate (NCC)   than 1x/week   a week        twice a month  times a year   seldom       never       answer
Jehovah's Witnesses                         4.37%      158 (74.2%)     21 (9.9%)      7 (3.3%)      13 (6.1%)      9 (4.2%)    4 (1.9%)    1 (0.5%)
Seventh-Day Adventist                       4.31%       35 (25.9%)     56 (41.5%)    14 (10.4%)     13 (9.6%)      8 (5.9%)    9 (6.7%)    0 (0.0%)
Church of Jesus Christ of
  Latter Day Saints                         1.42%      184 (33.1%)    256 (46.0%)    43 (7.7%)      34 (6.1%)     24 (4.3%)   15 (2.7%)    0 (0.0%)
Catholic Church                             0.57%      842 (10.5%)   2814 (34.9%)  1471 (18.3%)   1539 (19.1%)   953 (11.8%) 399 (5.0%)   36 (0.4%)
Assemblies of God                           0.52%      225 (46.9%)    135 (28.1%)    44 (9.2%)      38 (7.9%)     26 (5.4%)   11 (2.3%)    1 (0.2%)
Church of God (Cleveland, TN)               0.38%       65 (52.4%)     24 (19.4%)    15 (12.1%)     15 (12.1%)     3 (2.4%)    2 (1.6%)    0 (0.0%)
Southern Baptist Convention                -0.42%      846 (33.3%)    697 (27.5%)   347 (13.7%)    336 (13.2%)   220 (8.7%)   81 (3.2%)   12 (0.5%)
United Methodist Church                    -1.01%      248 (11.1%)    782 (34.9%)   446 (19.9%)    456 (20.4%)   243 (10.9%)  54 (2.4%)   10 (0.4%)
Lutheran Church-Missouri Synod             -1.08%       40 (6.8%)     225 (38.3%)   138 (23.5%)    114 (19.4%)    56 (9.5%)   13 (2.2%)    2 (0.3%)
American Baptist Churches in the USA       -1.55%       70 (17.0%)    114 (27.7%)    80 (19.5%)     82 (20.0%)    46 (11.2%)  16 (3.9%)    3 (0.7%)
Evangelical Lutheran Church in America     -1.96%       69 (7.9%)     359 (41.3%)   199 (22.9%)    158 (18.2%)    69 (7.9%)   14 (1.6%)    1 (0.1%)
Episcopal Church                           -2.48%       41 (8.6%)     144 (30.4%)   101 (21.3%)    106 (22.4%)    61 (12.9%)  16 (3.4%)    5 (1.1%)
Presbyterian Church (U.S.A.)               -2.61%       71 (13.1%)    238 (43.8%)   102 (18.8%)     81 (14.9%)    44 (8.1%)    6 (1.1%)    2 (0.4%)
United Church of Christ                    -2.83%       21 (8.5%)      81 (32.7%)    46 (18.5%)     62 (25.0%)    27 (10.9%)  10 (4.0%)    1 (0.4%)

Well, instead of combining categories I ran correlation statistics on all six meaningful responses.  (You could argue that not responding is meaningful, and looking at the numbers there is a case to be made – why do more Episcopalians not want to respond? – but that is a topic for another time.)   However, from crunching the numbers the first time I noticed that responses from those affiliated with the Catholic Church were frequently outliers, something I pointed out in the first post and something that can be seen in this data set.  It has been observed in other reports that cultural and immigration factors play a larger role in membership numbers for that denomination so I have chosen to exclude those responses from my analysis.

Today, the correlation statistics I calculated include both the linear correlation coefficient and the rank correlation.  I won’t go into that latter statistic, except to say that it is a good test for leverage by extreme values; for none of the responses was that significant, and the only response for which it might have a slight effect is “attend once or twice a month.”
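Since both statistics come up here, a small pure-Python sketch of the two.  The data lists are placeholders with one extreme value, chosen to show how ranking damps its leverage; the `ranks` helper ignores ties for simplicity:

```python
# Pearson (linear) vs. Spearman (rank) correlation, the two statistics
# compared in this post.
def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def ranks(v):
    # rank from 1..n; ties not handled (fine for this illustration)
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for rank, i in enumerate(order):
        r[i] = rank + 1.0
    return r

def spearman(x, y):
    # Spearman rank correlation = Pearson correlation of the ranks
    return pearson(ranks(x), ranks(y))

x = [5, 10, 15, 20, 200]   # one extreme value breaks the linear fit
y = [2, 4, 7, 8, 12]       # but the ordering is perfectly monotone
print(round(pearson(x, y), 2), round(spearman(x, y), 2))
```

The rank correlation comes out at 1.0 because the ordering is perfectly monotone, while the extreme x value pulls the linear coefficient noticeably lower.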

Now it turns out that my combining response categories yesterday may not have been a good way to treat the data, because the correlation for “once a week” was not only pretty low, but inverse at that.  The only category for which there was a meaningful positive correlation (0.74) was “attend more than once a week.”  For “attend once or twice a month” and “attend a few times a year” there are pretty strong negative correlations (-0.84 and -0.81 respectively).  I feel better — while my combining of categories may not have been the best move, it does not substantially change the “high bar” I saw: even “once or twice a month” attendance is associated with decline.  At this point I feel I can stick with yesterday’s conclusions.

But having embarked on this data exploration, let me continue with a couple new analyses.

First, using the strongest positive and negative correlations, let me ask, “where is the line between growing and declining?”  Now, remember this is only a guideline and not hard and fast, but if we run a linear regression on “more than once a week” we find that, using this as a predictor, denominations with more than 27.5% of affiliated respondents answering in that category are growing.  Looking at the table above (and remembering to skip the Catholic Church) we see that indicator holds up pretty well.  If we do the same with “once or twice a month” we get a predictor that tells us that growing denominations have less than 14.9% of affiliated respondents give that answer.  Again, in the table above this holds up with only one exception.  So while not perfect, these two numbers give a pretty good proxy for predicting growth or decline.
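Here is a sketch of that cutoff calculation, regressing the NCC growth rates on the “more than once a week” percentages from the table above (Catholic Church excluded, per the post) and solving for zero growth.  My simple fit lands near 28.7% rather than exactly at the 27.5% quoted above, so the author’s regression evidently differs in some detail:

```python
# Fit growth rate vs. "more than once a week" attendance percentage
# (13 denominations from the table above, Catholic Church excluded),
# then solve for the attendance level where predicted growth is zero.
pct =    [74.2, 25.9, 33.1, 46.9, 52.4, 33.3, 11.1,
           6.8, 17.0,  7.9,  8.6, 13.1,  8.5]
growth = [4.37, 4.31, 1.42, 0.52, 0.38, -0.42, -1.01,
         -1.08, -1.55, -1.96, -2.48, -2.61, -2.83]

n = len(pct)
mx, my = sum(pct) / n, sum(growth) / n
slope = (sum((p - mx) * (g - my) for p, g in zip(pct, growth))
         / sum((p - mx) ** 2 for p in pct))
intercept = my - slope * mx
cutoff = -intercept / slope   # attendance % where predicted growth = 0
print(round(cutoff, 1))
```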

So let’s apply these numbers.  First, what about non-denominational churches?  While by definition they don’t represent a denomination, and we don’t have NCC growth data for them, let’s have a look at the attendance statistics for the three most frequently reported nondenominational categories in the Religious Landscape Survey.

Category                           Attend more    Attend once   Attend once/   Attend a few   Attend      Attend     No
                                   than 1x/week   a week        twice a month  times a year   seldom      never      answer
Nondenominational Evangelical       138 (33.4%)   171 (41.4%)    62 (15.0%)     22 (5.3%)     12 (2.9%)   7 (1.7%)   1 (0.2%)
Nondenominational Charismatic        74 (43.0%)    51 (29.7%)    17 (9.9%)      10 (5.8%)     16 (9.3%)   4 (2.3%)   0 (0.0%)
Nondenominational Fundamentalist     41 (39.8%)    29 (28.2%)    14 (13.6%)     11 (10.7%)     8 (7.8%)   0 (0.0%)   0 (0.0%)

As you can see, all three have “more than once a week” numbers above the indicator, and two out of three have “once or twice a month” numbers below that indicator – and the third misses by only 0.1%.  The indication is that if these were denominations we would expect them to be growing.

OK, let’s get close to home — what about Presbyterian groups?  The survey has 22 self-reported categories of Presbyterians.  Here are a few of the more frequently reported ones.

Denomination                                      Attend more    Attend once   Attend once/   Attend a few   Attend       Attend      No
                                                  than 1x/week   a week        twice a month  times a year   seldom       never       answer
Presbyterian Church (U.S.A.)                       71 (13.1%)    238 (43.8%)   102 (18.8%)     81 (14.9%)    44 (8.1%)    6 (1.1%)    2 (0.4%)
Presbyterian Church in America                     30 (17.9%)     43 (25.6%)    37 (22.0%)     34 (20.2%)    18 (10.7%)   5 (3.0%)    1 (0.6%)
Associate Reformed Presbyterian                     3 (23.1%)      5 (38.5%)     2 (15.4%)      1 (7.7%)      1 (7.7%)    0 (0.0%)    1 (7.7%)
Orthodox Presbyterian                               2 (25.0%)      3 (37.5%)     0 (0.0%)       1 (12.5%)     1 (12.5%)   1 (12.5%)   0 (0.0%)
Evangelical Presbyterian                            6 (50.0%)      5 (41.7%)     1 (8.3%)       0 (0.0%)      0 (0.0%)    0 (0.0%)    0 (0.0%)
Conservative Presbyterian                           1 (100%)       0 (0.0%)      0 (0.0%)       0 (0.0%)      0 (0.0%)    0 (0.0%)    0 (0.0%)
Presbyterian (other not specified evangelical)      7 (13.7%)     17 (33.3%)    13 (25.5%)      8 (15.7%)     5 (9.8%)    1 (2.0%)    0 (0.0%)
Liberal Presbyterian                                0 (0.0%)       1 (100%)      0 (0.0%)       0 (0.0%)      0 (0.0%)    0 (0.0%)    0 (0.0%)
Presbyterian (other not specified mainline)        10 (5.6%)      28 (15.8%)    37 (20.9%)     51 (28.8%)    39 (22.0%)  11 (6.2%)    1 (0.6%)
Mainline Presbyterian                               5 (4.9%)      17 (16.5%)    12 (11.7%)     32 (31.1%)    28 (27.2%)   9 (8.7%)    0 (0.0%)

Well, maybe the most important thing about this table is a demonstration of the nature and limitations of surveys.  The first item is the statistics of small numbers.  This dataset works well for the largest denominations, but below the level of the PCA one would like to see a bigger sample.  The second is the self-reporting of affiliations, which leaves me wondering whether the two different mainline-but-unspecified categories should be folded into the PC(USA), ignored, or treated as their own group.  And what to do with our liberal and conservative friends?

However, taking the numbers at face value and using the indicators suggested above the only listed Presbyterian branch where we would expect growth is the EPC and the OPC is pretty close.  It is interesting to see the PCA numbers in the same ballpark as the PC(USA).

OK, bottom line — while I need to modify or qualify my attendance calculations from yesterday, the conclusion remains pretty much intact.  The difference between growing and declining denominations is not in getting Christmas and Easter members to church a couple more times a year (although that would be good) but in fostering an environment where religious faith and participation are taken seriously.

On to the next data set – PC(USA) amendment voting.  Stay tuned.

National Council Of Churches Membership Data — We Can Correlate That

This past Monday the National Council of Churches USA announced the release of their 2011 Yearbook, a press release that traditionally includes the membership data for the 25 largest denominations in the country.

My first reaction, after a quick look at the data, was “nothing new here — move along to something else.”

My second thought was “why don’t I just take that part of the post from last year, copy and paste it for this year, strike out the old numbers and fill in the new ones.”  In all honesty, the two sets of numbers look a lot alike and I was wondering if there was anything new worth saying about it.

Well, I finally came to my senses, remembered that my motto is “I never met a data set I didn’t like,” and on my commute home I thought about what I could do with it.  I then spent my lunch hours the rest of the week crunching data.  Yup, that’s the way I roll.

Now, a couple of years ago I correlated the NCC data against surveys about political opinions and found that for the mainline churches the degree of membership decline correlated with stronger liberal political opinions.  But, based on reading I have done in the last couple of years, I have modified this hypothesis and now think that part of the problem of decline is not the political opinions of the churches per se, but rather a lack of clear and well-defined beliefs and expectations, particularly in the mainline.  That is to say, trying to be too broad in doctrine leaves those looking for a church uncertain about that church and with no need to commit to anything in particular.  It is the hot and cold of Laodicea, shown on a small scale by the division of the Londonderry Presbyterian Church which split and, at least when I wrote about it a year and a half ago, the combined membership of the two churches had nearly doubled over what it was before the split.  (Now, when I get to the end of this post I won’t necessarily have proven that thesis, but I think it will support it.)

Now, to give credit where credit is due, this is not something I pulled out of thin air but, as I said, saw in the studies and essays I was reading.  Prominent among these, in the chronological order in which I read them: Beau Weston, Rebuilding the Presbyterian Establishment; Dean Hoge, Donald Luidens, and Benton Johnson, Vanishing Boundaries; Bradley Wright, Christians are hate-filled hypocrites… and other myths you’ve been told;  and Kenda Creasy Dean, Almost Christian.

So, I set about seeing if I could find correlations between indicators of strength of faith and the NCC data.  Thanks to Brad Wright's book I knew that the Religious Landscape Survey by the Pew Forum on Religion & Public Life was a wealth of information.  The data is split into two reports, the Religious Affiliation Report (full report) and the Religious Beliefs and Practices Report (full report).  To tinker a little more, I downloaded The U.S. Religious Landscape Survey dataset from the Pew Research Center.  The resulting analysis and data manipulation is mine and it should be kept in mind that "The Pew Research Center and the Pew Forum on Religion & Public Life bear no responsibility for the analyses or interpretations of the data presented here."  OK, you got the required disclaimer.

It was fun to look at the raw data because there are some interesting details in there, although they are generally not related to this present discussion.  For example, of the 480 participants who identified themselves with the Assemblies of God, 420 said there was a heaven but 432 said there was a hell.  While that may say something interesting about the theology, in fairness I would have trouble with the wording of the study's questions because they were based on merit, that is, on whether someone led "a good life," and not on Christ's free gift of eternal life.  Since individuals could self-identify the denomination they were with, it is interesting to note that one respondent said Emerging Church, and one each identified as Liberal Presbyterian and Conservative Presbyterian.  But my favorite has to be the two individuals who identified themselves as an Electronic Ministries Baptist and an Electronic Ministries Pentecostal.  Can I now call myself a Virtual Ministries Presbyterian?  We will have to wait to see when the Open Source Church appears.  I am going to keep playing with the dataset and see what other interesting details I can find.

Anyway, some additional interpretation details: The survey was conducted in 2007, so technically there is a bit of a time offset from the 2010 NCC data.  In addition, the data package comes with one database for the continental U.S. and another for Alaska and Hawai'i.  I only number-crunched the former, which contains a bit over 35,000 records.  For the first set of correlations with the demographic data I have taken the numbers from Appendix 3 of the Religious Affiliation report, which lists results as percentages with no decimal places.  Results for religious behavior that I calculated from the provided dataset are reported as percentages with one decimal place.  And for those interested in trying it themselves at home, the data is provided in SPSS format, which you can also read with the open source package PSPP.  I will talk about correlation coefficients, which test only for a linear correlation.  Finally, the data is supplied with a weighting scheme designed to reflect reliability, which I did not use for this initial exploration.
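For readers who want a feel for what a survey weighting scheme actually does, here is a minimal sketch in Python.  The respondents, answers, and weights below are entirely made up for illustration; the real RLS dataset supplies its own weight variable.

```python
# Sketch: applying a survey weighting scheme to a yes/no question.
# The answers and weights are hypothetical, not from the RLS dataset.

def weighted_pct(responses, weights, target):
    """Percentage of (weighted) respondents giving the target answer."""
    total = sum(weights)
    hits = sum(w for r, w in zip(responses, weights) if r == target)
    return 100.0 * hits / total

answers = ["yes", "no", "yes", "yes", "no"]
weights = [1.2, 0.8, 1.0, 0.5, 1.5]  # hypothetical reliability weights

print(round(weighted_pct(answers, weights, "yes"), 1))  # 54.0
```

With these made-up numbers the weighted figure (54.0%) differs from the raw 60%, which is exactly why skipping the weights, as I did for this first pass, is a simplification worth flagging.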

For the NCC data, of the 25 churches on the list only 14 provided numbers for membership change. Of these, we saw notable growth in the Church of Jesus Christ of Latter Day Saints (+1.42%), and significant growth in the Jehovah’s Witnesses (+4.37%) and the Seventh-Day Adventists (+4.31%).  There was small growth in the Roman Catholic church (+0.57%), the Assemblies of God (+0.52%), and the Church of God (Cleveland, Tenn.) (+0.38%).  The mainline/oldline churches had typical declines including the United Methodist Church (-1.01%), the Evangelical Lutheran Church in America (-1.96%), Presbyterian Church (U.S.A.) (-2.61%), the Episcopal Church (-2.48%), the American Baptist Churches in the U.S.A. (-1.55%), and the United Church of Christ (-2.83%).  Slightly smaller declines were experienced by more evangelical churches, such as the Southern Baptist Convention (-0.42%) and the Lutheran Church – Missouri Synod (-1.08%).

All of the correlations I ran are available in a web-published Google spreadsheet; the sheet also contains the 2009 membership changes and correlations with those as well.  For this discussion I only use the 2010 membership changes.  As always, use at your own risk.  For those who don't regularly work with correlations, a quick introduction: If the number is positive the correlation is direct, and if negative it is inverse.  Correlation coefficients range in absolute value from 1, which is perfect, down to 0 (zero), when there is no correlation.  Values of 0.8 and greater are generally considered strong correlations, while values below 0.5 indicate weak to no correlation and need to be looked at carefully.  Also, this analysis assumes that the correlation is linear, and I have not run tests for leverage effects by extreme values. (But as you will see in the graphs below, there are a pair of high values that usually cluster nicely.)
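For the curious, the statistic I am talking about is the standard Pearson correlation coefficient, which can be computed in a few lines of Python.  The two series below are hypothetical stand-ins for a survey indicator and the growth/decline figures, not the actual data.

```python
# A minimal Pearson correlation, the statistic used throughout this post.
from math import sqrt

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

attendance = [62.0, 55.0, 48.0, 41.0, 35.0]  # hypothetical percentages
growth     = [4.3, 0.5, -0.4, -1.5, -2.6]    # hypothetical % change

r = pearson(attendance, growth)  # close to +1 for this made-up data
```

Both made-up series move together, so the coefficient comes out strongly positive; a real dataset with scatter would land somewhere between the extremes, which is what the 0.7-to-0.8 values below reflect.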

The first demographic data I looked at was members' marital status, and there was little to no correlation between that and a denomination's growth rate.  However, looking at the extremes of the age distribution, we find that growing churches have a higher percentage of younger members (18-20 years old) than declining churches, and the declining churches have more older members (>65) than growing churches.
 
These correlations are good, at 0.77 and -0.78.  The question is whether there is a cause-and-effect relationship.  Are growing denominations growing because they have more young people, or are more young people there because they are growing?  We can probably safely conjecture that the relationship is complex and mutual, with a bit of each going on, probably establishing a positive or negative feedback loop.

The correlation with number of children is somewhat predictable given the preceding relationship. While families with no children are more likely at declining churches (correlation -0.63), it surprised me that the strongest correlation in the children categories was for families with one child to be at growing churches (correlation 0.81), while families with two children were completely uncorrelated (-0.03).  The correlation returns with moderate strength for three children (0.63) and is not quite as strong for four or more (0.50).  As above, assigning dependency is problematic and there is probably a complex relationship. (Maybe something to crunch the numbers around a bit for.)

There is one other demographic relationship and that has a moderate correlation — college grads are more common at declining denominations (correlation = -0.55).

Now, what about the idea I really wanted to test – that patterns of behavior and belief that indicate more intense or dedicated religious practice are correlated with denominational growth.  The survey provides us with several of these.

First, again taking a lead from Brad Wright’s book, I look at church attendance, as self-reported.  I have combined six categories down to three with the frequent attenders (once a week or more than once a week) in one group, the occasional (less than once a week but still multiple times a year) in the second group, and the seldom to none in the last group.
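As a sketch, that recoding looks something like this in Python.  The category labels here are paraphrased, so the actual RLS codebook wording may differ.

```python
# Sketch: collapsing six attendance categories into the three groups
# described above. Labels are paraphrased, not the RLS codebook text.

COLLAPSE = {
    "more than once a week": "frequent",
    "once a week":           "frequent",
    "once or twice a month": "occasional",
    "a few times a year":    "occasional",
    "seldom":                "seldom/never",
    "never":                 "seldom/never",
}

def tally(responses):
    counts = {"frequent": 0, "occasional": 0, "seldom/never": 0}
    for r in responses:
        counts[COLLAPSE[r]] += 1
    return counts

sample = ["once a week", "seldom", "more than once a week", "never"]
print(tally(sample))  # {'frequent': 2, 'occasional': 0, 'seldom/never': 2}
```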

In the first two cases there is a strong correlation: the frequent attenders (weekly or better) tend to be members of growing denominations (correlation = 0.76) and the less-than-weekly attenders tend to be members of declining denominations (correlation = -0.82).  The seldom-to-none group is also more likely to be in declining denominations, but the correlation is weaker (correlation = -0.40).  For comparison purposes, the Presbyterian Panel asks a similar question and found that 26% of members responded that they attend weekly and another 38% said they attended "nearly every week."  That total of 64% is a bit higher than the 56.9% in the RLS data, but seems a reasonable match in light of the different wording of the questions.

The survey has two ways of looking at the importance of religion to the participants.  The first is a direct question asking whether their religion is very important, somewhat important, not too important, or not at all important.  The percentages that answered very important and somewhat important are both well correlated with the growth/decline numbers, but in opposite senses.  For those who said their religion was very important there was a correlation of 0.74, indicating they are more likely to be in growing churches.  For those who answered somewhat important, the correlation is -0.74 and they are more likely to be in declining denominations.

The second is a question that asks “When it comes to questions of right and wrong, which of the following do you look to most for guidance.”  Of the four choices, two were substantially preferred by respondents.  “Guided by religious teachings and beliefs” is shown with the red squares in the graph below and has a 0.77 correlation with denominational growth.  On the chart you can see the outlier to the trend at 0.57% growth which is the data point for the Roman Catholic church.  Removing that data point the correlation jumps to a strong 0.83.  As you can see, the other strong answer is “Practical experience and common sense”, shown in green, and that has an inverse correlation at -0.77.  So in growing churches the members rely more on church teaching and in declining churches the members are guided more by their own experience.  It is interesting, and somewhat surprising to this scientist, how far below the first two the reliance on philosophy and on science fall.  And both of those have almost as strong inverse correlations.


 
You can have a look at the spreadsheet for a bunch of the other correlations I ran.  It pretty much holds up that strong religious beliefs, certainty in those beliefs, and practices correlate with denominational growth while the moderate to weak responses for these things are inversely correlated and are more likely in declining denominations.

Well, crunching the numbers is the easy part.  What does this all mean and can it be applied to reverse mainline decline?

First, let me say that I think it is difficult to separate what should be the neutral practices from the doctrine.  As I said, correlation coefficients for the relationship between beliefs and growth/decline are pretty much identical to correlations between practices and growth/decline.  To put it another way, at what point does regular weekly attendance at church change from being just a religious practice to being a matter of doctrine or belief?

Another tricky point here is that for most of the indicators measured, while the doctrinal ones may be teachings of the church, what the statistics show is not the effectiveness of the church's teachings directly, but the ethos of the church and the expectation for accepting those teachings.  In other words, almost every church would want a member to be guided by the church's teachings to determine right and wrong, but the growing denominations pass along not just the teaching, but the expectation that members take it seriously.

Finally, it has to be remembered that a denomination is composed of particular churches and in most cases we are measuring one of these on the level of the individual member and the other on the level of the denomination.  Lost in the middle are the different congregations where this is actually implemented.

So by way of conclusion here are two things that surprised me in this analysis:

The first was the uniformity of the correlations.  Yes, there were some variations but in general there were a lot that fell in the 0.7 to 0.8 range or the -0.7 to -0.8 range.  This suggests to me that you should not be looking through this to find the “silver bullet.” Instead, these measures show broad patterns that probably reflect the overall nature of the denominations rather than where to improve on one or two specific practices.

The second thing that surprised me was how high the bar was.  In looking at this data we are not seeing the line between growing and declining as being heresy or apostasy.  We are seeing the difference in whether members attend once a week or once a month.  We are seeing the difference in whether someone is certain of God, or fairly certain of God.

Now, I welcome you to stare at the data and draw your own conclusions.  My number one take-away is that "Being Christian" is not about what you do for one hour on Sunday morning (OK, one and a half hours if the sermon goes long and you stay for a cup of coffee).  Rather, it is about how you live your life the other 167 hours of the week.  It is about whether that hour influences the other 167.  It is about how your Christian faith affects the rest of your life.  To me, these data show that the indicator of a growing denomination is a pattern of faithfulness in many areas of our lives.

Your mileage may vary.  OK, now what do I do with my lunch hour next week?

Technical note:  I think it is important to note that for questions with only two choices any correlations with a third variable will be of the same magnitude and opposite sign for the two choices.  For the Guidance question above, while there were four choices, the Philosophy option and the Science option were selected by so few respondents that there are effectively only two answers, the Religion option and the Experience option. That is not the case with the demographic graph since substantial numbers of respondents fell into the age ranges between these two end groups.  Combined, the two end members represent no more than 40% of the sampled population.
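This two-choice property is easy to demonstrate: when one option's percentage is simply 100 minus the other's, the deviations from the mean are exact mirror images, so the two correlations mirror as well.  The numbers below are invented purely to show the algebra.

```python
# Demonstration of the technical note: complementary answer options
# (percentages summing to a constant) correlate with any third variable
# at equal magnitude and opposite sign. All numbers here are invented.
from math import sqrt

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (sqrt(sum((x - mx) ** 2 for x in xs)) *
                  sqrt(sum((y - my) ** 2 for y in ys)))

religion   = [60.0, 52.0, 45.0, 38.0]       # % choosing option A
experience = [100.0 - p for p in religion]  # % choosing option B
growth     = [1.4, 0.4, -1.0, -2.6]         # hypothetical growth rates

a = pearson(religion, growth)
b = pearson(experience, growth)
# a and b come out equal in magnitude and opposite in sign
```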

Strong Cross-Issue Correlation In PC(USA) Amendment Voting To Date

To give you fair warning right at the outset, this will be a fairly geeky post to go with the geeky title.  So let me begin with an executive summary for those who want to avoid the drill-down into the statistics.

Coming out of the 219th General Assembly of the Presbyterian Church (U.S.A.) in the summer of 2010 were three high-profile amendments to be voted on by the presbyteries:  addition of the Belhar Confession to the Book of Confessions, a new Form of Government section for the Book of Order, and Amendment 10-A which proposed new language for the “fidelity and chastity” section, G-6.0106b, of the Form of Government.  At the present time between thirty and fifty presbyteries have voted on each and the votes on each side are very evenly matched.  Furthermore, when you consider the relationship between votes on the different issues they are very strongly correlated.

While this is an interesting statistical result, there are two practical implications.  The first is that if voting continues to follow the current trends and the correlation holds, the final votes on nFOG and 10-A will be very close, but we can expect that the Belhar Confession will not be approved by the presbyteries since it requires a 2/3 vote for inclusion.  The second implication is that presbyteries, and by that we really mean their commissioners, might see some sort of strong linkage between these three items.  It is not clear to what extent any particular factor generates a linkage, but potential reasons could be related to maintaining or rejecting the status quo, affinity group promotion of particular votes, and perception of the issues as all being promoted by the centralized institution of the denomination.

Got that?  OK, for the geeks, nerds and other curious readers here is where this comes from…

I am taking the correlations from my own tally sheet of the voting on these issues.  My spreadsheet is not original to me but represents an aggregation of data from posts on Twitter, and other vote sheets from the Layman, Covenant Network, Yes On Amendment A, and Reclaim Biblical Teaching.  It is important to note that only the first and last of those have info on all three issues and the other two are only for 10-A.

As of yesterday morning, the Belhar Confession was at 21 yes and 20 no, the nFOG was tied at 15, and 10-A was at 27 yes and 25 no.  In total, 88 presbyteries – just over half – had voted on one or more of the issues.  Of these, 22 have voted on two of the issues: 9 on Belhar and nFOG, 7 on Belhar and 10-A, and 6 on nFOG and 10-A. Seven presbyteries have voted on all three issues: five voted no on all three, and two voted no on two of the three, with one voting yes on 10-A and the other yes on nFOG.

I eventually plan to run correlations on voting ratios for those presbyteries that have recorded vote counts, but for this analysis I maximized the sample set by just looking at the binary yes/no outcome.  I have a master matrix which those familiar with statistics should be careful not to confuse with a joint probability chart, since I have mixed the votes together.  (And I'm sorry if the 70's color scheme annoys you, but it is just my working spreadsheet and not intended for final publication.)

So, here are the charted data:

 n=16        Belhar Yes   Belhar No
 nFOG Yes         2            1
 nFOG No          0           13

 n=14        Belhar Yes   Belhar No
 10-A Yes         4            1
 10-A No          1            8

 n=12         10-A Yes     10-A No
 nFOG Yes         4            1
 nFOG No          1            6



Statistics of small numbers? Clearly. But I find it striking that so far only one presbytery has voted cross-wise on each combination except that no presbytery has yet voted no on nFOG and yes on Belhar.  I also think it is noteworthy that in each case, and most pronounced in the Belhar/nFOG voting, there are more presbyteries that have voted “no” on both than have voted “yes.”  For Belhar/10-A and 10-A/nFOG this goes away, and even reverses, if you take out the presbyteries that have voted on all three.
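As a side note, a 2x2 yes/no table like these has a natural correlation measure of its own, the phi coefficient, which I have not used above.  As a sketch, here it is computed for the Belhar/nFOG counts from the first table:

```python
# Sketch: the phi coefficient for a 2x2 table [[a, b], [c, d]],
# here using the Belhar/nFOG counts (yes/yes=2, yes/no=1, no/yes=0, no/no=13).
from math import sqrt

def phi(a, b, c, d):
    """Correlation measure for a 2x2 contingency table."""
    denom = sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return (a * d - b * c) / denom

print(round(phi(2, 1, 0, 13), 2))  # 0.79
```

A phi near 0.79 is consistent with the strong pairing of no/no votes noted above, with the same small-numbers caveat applying.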

Looking at the bigger picture, while the total vote counts don’t provide any definitive correlation data, their very close margins at the present time are completely compatible with the interpretation that the votes are correlated.  In other words, if the votes are correlated very similar vote counts would be expected (which we have).  But this observation is only necessary and not sufficient for the interpretation.  Additionally, when vote counts are recorded there are usually very similar vote distributions for each of these issues, giving additional evidence of their correlation.

Calculating the numbers is the easy part; figuring out whether they are meaningful is more difficult.  With less than 10% of the presbyteries actually represented in any of these correlation charts at this point, I firmly acknowledge that this could all easily change around very quickly.  So I don't want to over-interpret the data, but I do think some corresponding observations are in order.

The simplest explanation is that while the votes may be correlated they are not linked.  In this case a commissioner would make up his or her mind on each issue independently, without regard for the other two issues.  The result is that most commissioners, after weighing the arguments and reflecting on the information, would be guided to vote the same way on each of the issues.  This is a likely conclusion, especially for those presbyteries that schedule the voting at three different meetings.

But even with our best efforts to be thoughtful and treat each issue independently I have observed a few things around the denomination that tend to link these issues together.  In some cases this is fairly prominent and in other cases I suspect the influence may be at a subliminal level.

The first possible effect is that affinity groups, by recommending the same votes on all three issues, are having an effect and providing a linkage, even if only implied.  Resources at Theology Matters and the Reclaim Biblical Teaching site of the Presbyterian Coalition both recommend a no vote on all three issues.  Similarly, the Covenant Network and Presbyterian Voices for Justice are in favor of all three actions — although to be fair, PVJ voices are not unanimously in favor of nFOG.  What has been set up, rightly or wrongly, appears to be a "party-line" vote where you vote yes on the slate if you are progressive or liberal and vote no if you are conservative or orthodox.  This linkage of Belhar and 10-A has been floating around for a while.  It is tougher to tell if there are real linkages of these two with nFOG, or whether they are not linked but rather appeal to the same theological base, or possibly whether the issue is "guilt by association."  Maybe another linkage between nFOG and Belhar is not theological but logistical, and some of the negative sentiment simply stems from the church not having had the time to discuss and explore them enough yet. Yes, quite possible, despite the fact that we were supposed to be doing that with both issues for the last two years between assemblies.

Beyond the third-party recommendations, let me put forward more subtle explanations – inertia & cynicism.  This is somewhat related to the lack of familiarity argument above but more about the seven last words of the church – “We’ve never done it that way before.”  The question I have is how many presbytery commissioners are opposed to all of them because this seems like change for change’s sake?  Or how many are for it because the church needs to change?  Or to put it another way – “if it ain’t broke why are we trying to fix it?”  A similar argument against Belhar and nFOG could be “if it comes from Louisville it must not be good.”  Remember, neither of these finally came as a presbytery overture but as recommendations from GA entities. (The nFOG has been talked about for a while but the recommendation to form the Task Force was the result of a referral to the OGA.  The request to study the Belhar Confession came from the Advocacy Committee on Racial-Ethnic Concerns.)

Now let me be clear before I am set upon in the comments: For each of these amendments there are very good arguments for and against them and as presbytery commissioners we set about weighing these arguments and discerning God’s will together.  I would expect few if any commissioners would vote solely on the idea that “nothing good can come from Louisville.”  What I do expect is that for some individuals the preservation of the status quo and skepticism of proposals that are top-down rather than bottom-up from the presbyteries are important factors, explicitly or implicitly.

Well, I am afraid that I have gotten too close to the great quote from Mark Twain – "There is something fascinating about science. One gets such wholesale returns of conjecture out of such a trifling investment of fact."  Considering we are still in the early stages of the voting, I may indeed be guilty of over-interpreting the data.  So rather than provide more conjecture, let me ask a question that may be hinted at, but not answerable, by these data or even the final data set:  Are we doing our deliberations and voting a disservice by having so many high-profile votes in a single year?  To put it another way: Are our explicit or implicit linkages of issues, valid or not, unfairly influencing the votes?  Something to think about and keep probing the data for answers.

So, until next time, happy data crunching.